RStudio Keras layer_attention usage

Problem description

I have searched, almost without success, for an example of the (fairly) recently introduced layer_attention in the R Keras library.

I got the network to work in two ways:

- LSTM(return_sequences = TRUE) -> Attention -> LSTM(return_sequences = FALSE)
- LSTM(return_sequences = TRUE) -> Attention -> Flatten, before the dense layers

Note that in my code layer_flatten is commented out; you can comment out the second layer_lstm instead. Both approaches output a one-dimensional tensor that at least appears to match the expected dimensionality of the network's output.

Which is the correct, or at least the wiser, approach? I am not very experienced in this area...

library(keras)

if (exists("nn_model")) rm(nn_model)

lstm_units <- 16L
lstm_seq_len <- 4L 
nfeatures <- 2L; final_diffs <- 1:3

inputs <- 
  layer_input(shape = list(lstm_seq_len,nfeatures))

lstm_output <- 
  inputs %>% 
  layer_lstm(
    units = lstm_units, activation = 'relu',
    return_sequences = TRUE, stateful = FALSE, name = 'lstm1'
  )

predictions <-
  layer_attention(
    inputs = list(lstm_output, lstm_output),  # self-attention: query = value
    use_scale = FALSE, causal = FALSE,
    name = 'attention', trainable = TRUE
  ) %>%
  layer_lstm(
    units = lstm_units, return_sequences = FALSE, name = 'lstm2'
  ) %>%
  # layer_flatten() %>%   # alternative to the second LSTM
  layer_dense(units = 64L, activation = NULL, name = 'dense1') %>% 
  layer_batch_normalization(name = 'bn1') %>%  
  layer_activation(activation = "relu", name = 'act1') %>%  
  layer_dense(units = 32L, name = 'dense2') %>% 
  layer_batch_normalization(name = 'bn2') %>%  
  layer_activation(activation = "relu", name = 'act2') %>%  
  layer_dense(units = length(final_diffs), activation = 'softmax', name = 'dense3')

optimizer <- 
  optimizer_adam(lr = 1e-5)

nn_model <- 
  keras_model(inputs = inputs,outputs = predictions)

nn_model %>% 
  keras::compile(
    optimizer = optimizer, loss = 'categorical_crossentropy',
    metrics = 'categorical_accuracy'
  )

summary(nn_model)

# note: runif(n, 1) draws from [1, 1] (all ones); runif(n) gives uniform [0, 1]
predict(nn_model, array(runif(lstm_seq_len * nfeatures), dim = c(1, lstm_seq_len, nfeatures)))
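For intuition: layer_attention with inputs = list(query, value) computes dot-product similarity scores between the two sequences, softmaxes them over the time axis, and returns a weighted sum of value, so the output keeps the (seq_len, units) shape that the second LSTM (or layer_flatten) then consumes. A minimal base-R sketch of that computation for a single sample (unscaled, matching use_scale = FALSE; dot_attention is a hypothetical helper name, not part of keras):

```r
# Base-R sketch of unscaled dot-product self-attention on one sample.
# query and value are (seq_len, units) matrices, as one timestep batch
# of the LSTM output above.
dot_attention <- function(query, value) {
  scores  <- query %*% t(value)                   # (seq_len, seq_len) similarity scores
  weights <- exp(scores) / rowSums(exp(scores))   # softmax over each row (time axis)
  weights %*% value                               # (seq_len, units): weighted sum of value
}

q <- matrix(1:8 / 8, nrow = 4, ncol = 2)  # seq_len = 4, units = 2
out <- dot_attention(q, q)                # self-attention: query = value
dim(out)  # 4 2 -- shape is preserved, ready for lstm2 or layer_flatten
```

Because each output row is a convex combination of the rows of value, the attention output stays in the same range as its input; only the second LSTM (or the flatten + dense stack) reduces it to the final one-dimensional prediction.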

Solution

No working solution has been found for this question yet.
