Problem description
I first ran the code (with the same data) on CPU under Keras 1.2.0, and then ran it again under Keras 2.0.3; both runs use the TensorFlow backend. The problem comes from Conv1D: many of its parameters were renamed between versions, and I want to reproduce the Keras 1.2.0 Conv1D behavior exactly, because I do not get the same results with Keras 2.
Here is my code on Keras 1:
from keras import regularizers
from keras.layers import (Embedding, Conv1D, MaxPooling1D, Flatten,
                          Dense, BatchNormalization, concatenate)

def core_model_CNN(sequence_input, sequence_length, vocabulary_size, n_out,
                   embedding_dim, embedding_matrix,
                   filter_sizes=[1, 2, 3], num_filters=100, drop=0.1):
    embedding = Embedding(input_dim=vocabulary_size,
                          output_dim=embedding_dim,
                          input_length=sequence_length,
                          weights=[embedding_matrix],
                          trainable=False)
    embedded_sequences = embedding(sequence_input)
    convs = []
    for fsz in filter_sizes:
        # Keras 1 Conv1D argument names: nb_filter, filter_length,
        # border_mode, subsample_length
        conv = Conv1D(nb_filter=32, filter_length=fsz, border_mode='valid',
                      activation='relu', subsample_length=1)(embedded_sequences)
        # pooling over the full convolved length = global max over time
        pool = MaxPooling1D(pool_length=sequence_length - fsz + 1)(conv)
        flattenMax = Flatten()(pool)
        convs.append(flattenMax)
    l_merge = concatenate(convs, axis=1)
    dense1 = Dense(300, activation='swish')(l_merge)
    dense1 = BatchNormalization()(dense1)
    dense1 = Dense(250, activation='swish')(dense1)
    dense1 = BatchNormalization()(dense1)
    dense1 = Dense(200, activation='swish')(dense1)
    dense1 = BatchNormalization()(dense1)
    dense1 = Dense(150, activation='swish')(dense1)
    dense1 = BatchNormalization()(dense1)
    dense1 = Dense(100, activation='swish')(dense1)
    dense1 = BatchNormalization()(dense1)
    output = Dense(units=n_out, activation='softmax',
                   kernel_regularizer=regularizers.l2(2))(dense1)
    return output
On Keras 2 it becomes:
conv = tf.keras.layers.Conv1D(filters=params['nb_filter'], kernel_size=fsz,
                              padding='valid', bias_initializer='zeros',
                              strides=1)(embedded_sequences)
pool = tf.keras.layers.MaxPooling1D(pool_size=sequence_length - fsz + 1)(conv)
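For reference, the Conv1D keyword renames between Keras 1 and Keras 2 can be captured mechanically. The helper below is a hypothetical illustration (not part of the original post) that maps the old keyword names to their Keras 2 equivalents; arguments that kept their name, such as `activation`, pass through unchanged:

```python
# Keras 1 -> Keras 2 renames for the Conv1D arguments used in the question.
KERAS1_TO_KERAS2 = {
    'nb_filter': 'filters',
    'filter_length': 'kernel_size',
    'border_mode': 'padding',
    'subsample_length': 'strides',
}

def convert_conv1d_kwargs(k1_kwargs):
    """Return Keras-2-style kwargs for a Keras-1-style Conv1D call."""
    return {KERAS1_TO_KERAS2.get(k, k): v for k, v in k1_kwargs.items()}

print(convert_conv1d_kwargs(
    {'nb_filter': 32, 'filter_length': 3, 'border_mode': 'valid',
     'activation': 'relu', 'subsample_length': 1}))
# {'filters': 32, 'kernel_size': 3, 'padding': 'valid',
#  'activation': 'relu', 'strides': 1}
```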
Solution
No confirmed solution to this problem has been posted yet.
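One observation that may explain the mismatch: comparing the two snippets term by term, the Keras 2 port drops `activation='relu'`, which the Keras 1 call had, and that omission alone changes the model's output. Below is an untested sketch (not a confirmed fix) of a Keras 2 call mirroring every Keras 1 argument one-to-one; the shape values are hypothetical, chosen only to make the example self-contained:

```python
import tensorflow as tf

# Hypothetical shapes for illustration only.
sequence_length, embedding_dim, fsz = 10, 8, 3

x = tf.keras.Input(shape=(sequence_length, embedding_dim))
conv = tf.keras.layers.Conv1D(
    filters=32,          # was nb_filter=32
    kernel_size=fsz,     # was filter_length=fsz
    padding='valid',     # was border_mode='valid'
    activation='relu',   # present in the Keras 1 code, missing from the port
    strides=1,           # was subsample_length=1
)(x)
pool = tf.keras.layers.MaxPooling1D(pool_size=sequence_length - fsz + 1)(conv)
model = tf.keras.Model(x, pool)
print(model.output_shape)  # (None, 1, 32)
```

Even with identical arguments, bit-exact reproduction across Keras versions is not guaranteed (default weight initializers and random seeds may differ), so comparing architectures with `model.summary()` and fixing seeds is a reasonable next step.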