Package: attention 0.4.0
attention: Self-Attention Algorithm
Helper functions for the self-attention algorithm, plus demonstration vignettes of increasing depth on how to construct it. Based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) "Speech and Language Processing (3rd ed.)" <https://web.stanford.edu/~jurafsky/slp3/>, and Alex Graves (2020) "Attention and Memory in Deep Learning" <https://www.youtube.com/watch?v=AIiwuClvH6k>.
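For orientation, a minimal base-R sketch of the scaled dot-product self-attention computation the package demonstrates (toy random matrices; the dimensions and projection weights are illustrative assumptions, not the package's API):

```r
# Toy input: 3 tokens with embedding dimension 4
set.seed(1)
X <- matrix(rnorm(12), nrow = 3, ncol = 4)

# Query/key/value projections (random here purely for illustration)
W_Q <- matrix(rnorm(16), 4, 4)
W_K <- matrix(rnorm(16), 4, 4)
W_V <- matrix(rnorm(16), 4, 4)
Q <- X %*% W_Q
K <- X %*% W_K
V <- X %*% W_V

# Scaled dot-product scores: Q K^T / sqrt(d_k)
scores <- Q %*% t(K) / sqrt(ncol(K))

# Row-wise softmax, subtracting each row's max for numerical stability
softmax_rows <- function(m) {
  e <- exp(m - apply(m, 1, max))
  e / rowSums(e)
}
weights <- softmax_rows(scores)

# Each output row is an attention-weighted mix of the value rows
output <- weights %*% V
```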
Authors: Bastiaan Quast [aut, cre]
Downloads:
- Source: attention_0.4.0.tar.gz
- Windows binaries: attention_0.4.0.zip (r-4.5), attention_0.4.0.zip (r-4.4), attention_0.4.0.zip (r-4.3)
- macOS binaries: attention_0.4.0.tgz (r-4.4-any), attention_0.4.0.tgz (r-4.3-any)
- Linux binaries: attention_0.4.0.tar.gz (r-4.5-noble), attention_0.4.0.tar.gz (r-4.4-noble)
- WebAssembly binaries: attention_0.4.0.tgz (r-4.4-emscripten), attention_0.4.0.tgz (r-4.3-emscripten)
Documentation: attention.pdf | attention.html
API: attention/json
NEWS
# Install 'attention' in R:
install.packages('attention', repos = c('https://bquast.r-universe.dev', 'https://cloud.r-project.org'))
Bug tracker: https://github.com/bquast/attention/issues
Last updated 1 year ago from commit 64e8f97e0f. Checks: OK: 7. Indexed: yes.
Target | Result | Date |
---|---|---|
Doc / Vignettes | OK | Nov 08 2024 |
R-4.5-win | OK | Nov 08 2024 |
R-4.5-linux | OK | Nov 08 2024 |
R-4.4-win | OK | Nov 08 2024 |
R-4.4-mac | OK | Nov 08 2024 |
R-4.3-win | OK | Nov 08 2024 |
R-4.3-mac | OK | Nov 08 2024 |
Exports: attention, ComputeWeights, RowMax, SoftMax
Dependencies:
Readme and manuals
Help Manual
Help page | Topics |
---|---|
Attention mechanism | attention |
SoftMax sigmoid function | ComputeWeights |
Maximum of Matrix Rows | RowMax |
SoftMax sigmoid function | SoftMax |
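A hedged usage sketch of two of the exported helpers; the call signatures are assumptions based on the help-page titles above, so consult the manual pages for the authoritative interface:

```r
library(attention)

# A small score matrix to operate on
S <- matrix(c(2, 1, 0, 3), nrow = 2)

# Assumed from "Maximum of Matrix Rows": the per-row maxima of S
RowMax(S)

# Assumed from "SoftMax sigmoid function": normalises scores into weights
SoftMax(S)

# attention() and ComputeWeights() tie these steps together; see the manual above.
```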