# --------------------------------------------
# CITATION file created with {cffr} R package
# See also: https://docs.ropensci.org/cffr/
# --------------------------------------------

cff-version: 1.2.0
message: 'To cite package "attention" in publications use:'
type: software
license: GPL-3.0-or-later
title: 'attention: Self-Attention Algorithm'
version: 0.4.0
doi: 10.32614/CRAN.package.attention
abstract: Self-Attention algorithm helper functions and demonstration vignettes of
  increasing depth on how to construct the Self-Attention algorithm. Based on Vaswani
  et al. (2017), Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) "Speech
  and Language Processing (3rd ed.)", and Alex Graves (2020) "Attention and Memory
  in Deep Learning".
authors:
- family-names: Quast
  given-names: Bastiaan
  email: bquast@gmail.com
  orcid: https://orcid.org/0000-0002-2951-3577
repository: https://bquast.r-universe.dev
commit: 64e8f97e0f114c836481882a78ad1f695b09ebeb
contact:
- family-names: Quast
  given-names: Bastiaan
  email: bquast@gmail.com
  orcid: https://orcid.org/0000-0002-2951-3577