Package 'transformer'

Title: Implementation of Transformer Deep Neural Network with Vignettes
Description: The transformer is a deep neural network architecture based, among other things, on the attention mechanism (Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>).
Authors: Bastiaan Quast [aut, cre]
Maintainer: Bastiaan Quast <[email protected]>
License: MIT + file LICENSE
Version: 0.2.0
Built: 2024-11-04 03:02:51 UTC
Source: https://github.com/bquast/transformer

Help Index

feed_forward    Feed Forward Layer
layer_norm      Layer Normalization
multi_head      Multi-Headed Attention
row_means       Row Means
row_vars        Row Variances
transformer     Transformer


Feed Forward Layer

Description

Position-wise feed-forward layer of the transformer architecture.

Usage

feed_forward(x, dff, d_model)

Arguments

x

inputs

dff

inner dimension of the feed-forward network

d_model

dimension of the model

Value

output of the feed-forward layer
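
Examples

A minimal usage sketch; the dimensions below are illustrative and assume x is a matrix whose number of columns equals d_model.

# 4 positions with a model dimension of 8; dff sets the hidden width (illustrative values)
x <- matrix(rnorm(4 * 8), 4, 8)
feed_forward(x, dff = 32, d_model = 8)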


Layer Normalization

Description

Applies layer normalization to the inputs.

Usage

layer_norm(x, epsilon = 1e-06)

Arguments

x

inputs

epsilon

small constant added for numerical stability

Value

outputs of layer normalization
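
Examples

A minimal usage sketch; the input dimensions are illustrative, epsilon is left at its default, and (as suggested by the row_means and row_vars helpers) each row is presumably normalized across the feature dimension.

# normalize a 10 x 4 matrix of random inputs
x <- matrix(rnorm(10 * 4), 10, 4)
layer_norm(x)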


Multi-Headed Attention

Description

Multi-headed attention as described in Vaswani et al. (2017).

Usage

multi_head(Q, K, V, d_model, num_heads, mask = NULL)

Arguments

Q

queries

K

keys

V

values

d_model

dimension of the model

num_heads

number of heads

mask

optional mask

Value

multi-headed attention outputs
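
Examples

A minimal self-attention sketch; the values are illustrative, Q, K, and V are assumed to be matrices with d_model columns, and num_heads is assumed to divide d_model evenly.

d_model <- 8
num_heads <- 2
# self-attention: queries, keys, and values all come from the same input
x <- matrix(rnorm(5 * d_model), 5, d_model)
multi_head(x, x, x, d_model, num_heads)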


Row Means

Description

Computes the mean of each row of a matrix.

Usage

row_means(x)

Arguments

x

matrix

Value

vector with the mean of each row of the input matrix

Examples

row_means(t(matrix(1:5))) # a 1 x 5 matrix; returns the mean of 1:5, i.e. 3

Row Variances

Description

Computes the variance of each row of a matrix.

Usage

row_vars(x)

Arguments

x

matrix

Value

vector with the variance of each row of the input matrix

Examples

row_vars(t(matrix(1:5))) # a 1 x 5 matrix; returns the variance of the row 1:5

Transformer

Description

Applies a transformer layer to the inputs.

Usage

transformer(x, d_model, num_heads, dff, mask = NULL)

Arguments

x

inputs

d_model

dimension of the model

num_heads

number of heads

dff

inner dimension of the feed-forward network

mask

optional mask

Value

output of the transformer layer

Examples

x <- matrix(rnorm(50 * 512), 50, 512) # 50 positions, each a 512-dimensional embedding
# d_model = 512, 8 heads, dff = 2048 are the base-model settings of Vaswani et al. (2017)
d_model <- 512
num_heads <- 8
dff <- 2048

output <- transformer(x, d_model, num_heads, dff)
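
A transformer layer should preserve the shape of its input (here 50 x 512), so layers can presumably be stacked by feeding one call's output into the next:

# stack a second, illustrative layer on top of the first
output2 <- transformer(output, d_model, num_heads, dff)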