Package: transformer 0.2.0
transformer: Implementation of Transformer Deep Neural Network with Vignettes
The Transformer is a deep neural network architecture based, among other things, on the attention mechanism (Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>).
Authors:
Downloads:
transformer_0.2.0.tar.gz
transformer_0.2.0.zip (r-4.5) | transformer_0.2.0.zip (r-4.4) | transformer_0.2.0.zip (r-4.3)
transformer_0.2.0.tgz (r-4.4-any) | transformer_0.2.0.tgz (r-4.3-any)
transformer_0.2.0.tar.gz (r-4.5-noble) | transformer_0.2.0.tar.gz (r-4.4-noble)
transformer_0.2.0.tgz (r-4.4-emscripten) | transformer_0.2.0.tgz (r-4.3-emscripten)
transformer.pdf | transformer.html
transformer/json (API)
# Install 'transformer' in R:
install.packages('transformer', repos = c('https://bquast.r-universe.dev', 'https://cloud.r-project.org'))
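# Once installed, the package is loaded in the usual way:
library(transformer)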
Bug tracker: https://github.com/bquast/transformer/issues
Last updated 1 year ago from commit 5da363dfa2. Checks: OK: 7. Indexed: yes.
Target | Result | Date |
---|---|---|
Doc / Vignettes | OK | Nov 04 2024 |
R-4.5-win | OK | Nov 04 2024 |
R-4.5-linux | OK | Nov 04 2024 |
R-4.4-win | OK | Nov 04 2024 |
R-4.4-mac | OK | Nov 04 2024 |
R-4.3-win | OK | Nov 04 2024 |
R-4.3-mac | OK | Nov 04 2024 |
Exports: attention, row_means, row_vars, SoftMax, transformer
Dependencies: attention
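As a rough illustration of what these exports compute, the sketch below implements the standard definitions of a row-wise softmax, row means/variances, and scaled dot-product attention (Vaswani et al., 2017) in base R. This is a hedged sketch of the underlying math, not the package's actual code; the real functions in 'transformer' and in the 'attention' dependency may use different names, argument orders, and defaults.

# Sketch only: standard definitions, assumed to mirror the exported helpers.

# Row-wise softmax of a numeric matrix
soft_max_sketch <- function(x) {
  e <- exp(x - apply(x, 1, max))   # subtract the row maximum for numerical stability
  e / rowSums(e)
}

# Row means and row (sample) variances
row_means_sketch <- function(x) rowMeans(x)
row_vars_sketch  <- function(x) apply(x, 1, var)

# Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
attention_sketch <- function(Q, K, V) {
  scores <- Q %*% t(K) / sqrt(ncol(K))
  soft_max_sketch(scores) %*% V
}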
Readme and manuals
Help Manual
Help page | Topics |
---|---|
Feed Forward Layer | feed_forward |
Layer Normalization | layer_norm |
Multi-Headed Attention | multi_head |
Row Means | row_means |
Row Variances | row_vars |
Transformer | transformer |
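The remaining help topics (feed_forward, layer_norm, multi_head) correspond to the standard Transformer building blocks from Vaswani et al. (2017). The base-R sketch below shows what layer normalization and the position-wise feed-forward layer typically compute; it is an assumption-laden illustration, not the package's implementation, and the actual layer_norm() and feed_forward() functions may differ in signature and behaviour.

# Sketch only: textbook layer normalization and feed-forward layer.

# Normalise each row (token representation) to mean 0 and unit scale
layer_norm_sketch <- function(x, eps = 1e-6) {
  mu    <- rowMeans(x)
  sigma <- apply(x, 1, sd)
  (x - mu) / (sigma + eps)
}

# Position-wise feed-forward: ReLU(x W1 + b1) W2 + b2, applied row by row
feed_forward_sketch <- function(x, W1, b1, W2, b2) {
  h <- pmax(x %*% W1 + matrix(b1, nrow(x), length(b1), byrow = TRUE), 0)  # ReLU
  h %*% W2 + matrix(b2, nrow(x), length(b2), byrow = TRUE)
}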