Correctly preprocessing CSV data for a 1D CNN

Problem description

I'm having trouble preparing my dataset to feed into a 1D CNN.

My CSV has 3025 columns, each representing a single byte, plus a final column that holds a string label.
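For reference, a quick shape check on the loaded frame (a minimal sketch, reusing the test.csv file and label column from the code further below):

import pandas as pd

df = pd.read_csv("test.csv", header=0)
print(df.shape)               # expected: (n_samples, 3026) = 3025 byte columns + 1 label column
print(df["label"].nunique())  # expected: 4 distinct classes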

Maybe it's not a preprocessing problem but an issue with my network model.

Here is my model:

from tensorflow import keras
from tensorflow.keras import models
from tensorflow.keras.layers import Conv1D, MaxPooling1D, Dropout, Dense

def cnn_1d(num_classes):
    model = models.Sequential()
    model.add(Conv1D(16, kernel_size=3, strides=1, activation="relu", input_shape=(3025, 1)))
    model.add(Conv1D(16, kernel_size=3, activation="relu"))
    model.add(MaxPooling1D(pool_size=2))
    model.add(Dropout(0.2))
    model.add(Conv1D(32, kernel_size=3, activation="relu"))
    model.add(Conv1D(32, kernel_size=3, activation="relu"))
    model.add(MaxPooling1D(pool_size=2))
    model.add(Dropout(0.2))
    model.add(Dense(500, activation="relu"))
    model.add(Dense(300, activation="relu"))
    model.add(Dense(num_classes, activation="softmax"))
    model.compile(
        optimizer=keras.optimizers.Adam(1e-3),
        loss="categorical_crossentropy",
        metrics=["accuracy"],
    )
    model.summary()
    return model

And here is how I tried to preprocess the data:

import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split

num_classes = 4
df = pd.read_csv("test.csv", header=0)

# encode the string labels as integer codes
df["label"] = pd.Categorical(df["label"])
df["label"] = df.label.cat.codes

Y = df.pop("label")
X = df.copy()

x_train, x_test, y_train, y_test = train_test_split(
    np.asarray(X), np.asarray(Y), test_size=0.33, shuffle=True)

# add a channel dimension for Conv1D: (samples, 3025) -> (samples, 3025, 1)
x_train = np.reshape(x_train, (x_train.shape[0], x_train.shape[1], 1))
x_test = np.reshape(x_test, (x_test.shape[0], x_test.shape[1], 1))

model = cnn_1d(num_classes)
model.fit(x_train, y_train, epochs=100, batch_size=64,
          validation_data=(x_test, y_test))

I think I'm getting an error at the last Dense layer because the labels aren't handled correctly. This is the error I get:

 ValueError: Shapes (None,1) and (None,753,4) are incompatible

What am I missing? All I know is that the last Dense layer should have num_classes units (4 in my case).

Solution

This is the model summary for the code you showed above:

Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv1d (Conv1D)              (None, 3023, 16)          64        
_________________________________________________________________
conv1d_1 (Conv1D)            (None, 3021, 16)          784       
_________________________________________________________________
max_pooling1d (MaxPooling1D) (None, 1510, 16)          0         
_________________________________________________________________
dropout (Dropout)            (None, 1510, 16)          0         
_________________________________________________________________
conv1d_2 (Conv1D)            (None, 1508, 32)          1568      
_________________________________________________________________
conv1d_3 (Conv1D)            (None, 1506, 32)          3104      
_________________________________________________________________
max_pooling1d_1 (MaxPooling1 (None, 753, 32)           0         
_________________________________________________________________
dropout_1 (Dropout)          (None, 753, 32)           0         
_________________________________________________________________
dense (Dense)                (None, 753, 500)          16500     
_________________________________________________________________
dense_1 (Dense)              (None, 753, 300)          150300    
_________________________________________________________________
dense_2 (Dense)              (None, 753, 4)            1204      
=================================================================
Total params: 173,524
Trainable params: 173,524
Non-trainable params: 0

The output layer has shape (batch, sequence length, 4 classes). You probably meant to flatten the tensor after the second max-pooling layer.

If I do that, I get a model that outputs exactly one of the 4 classes per sample...

from tensorflow import keras
from tensorflow.keras import models
from tensorflow.keras.layers import Conv1D, MaxPooling1D, Dropout, Flatten, Dense

def cnn_1d(num_classes):
    model = models.Sequential()
    model.add(Conv1D(16, kernel_size=3, strides=1, activation="relu", input_shape=(3025, 1)))
    model.add(Conv1D(16, kernel_size=3, activation="relu"))
    model.add(MaxPooling1D(pool_size=2))
    model.add(Dropout(0.2))
    model.add(Conv1D(32, kernel_size=3, activation="relu"))
    model.add(Conv1D(32, kernel_size=3, activation="relu"))
    model.add(MaxPooling1D(pool_size=2))
    model.add(Flatten())  # collapse (753, 32) into a flat vector of 24096 values
    model.add(Dropout(0.2))
    model.add(Dense(500, activation="relu"))
    model.add(Dense(300, activation="relu"))
    model.add(Dense(num_classes, activation="softmax"))
    model.compile(
        optimizer=keras.optimizers.Adam(1e-3),
        loss="categorical_crossentropy",
        metrics=["accuracy"],
    )
    model.summary()
    return model

cnn_1d(4)

Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv1d_4 (Conv1D)            (None, 3023, 16)          64        
_________________________________________________________________
conv1d_5 (Conv1D)            (None, 3021, 16)          784       
_________________________________________________________________
max_pooling1d_2 (MaxPooling1 (None, 1510, 16)          0         
_________________________________________________________________
dropout_2 (Dropout)          (None, 1510, 16)          0         
_________________________________________________________________
conv1d_6 (Conv1D)            (None, 1508, 32)          1568      
_________________________________________________________________
conv1d_7 (Conv1D)            (None, 1506, 32)          3104      
_________________________________________________________________
max_pooling1d_3 (MaxPooling1 (None, 753, 32)           0         
_________________________________________________________________
flatten (Flatten)            (None, 24096)             0         
_________________________________________________________________
dropout_3 (Dropout)          (None, 24096)             0         
_________________________________________________________________
dense_3 (Dense)              (None, 500)               12048500  
_________________________________________________________________
dense_4 (Dense)              (None, 300)               150300    
_________________________________________________________________
dense_5 (Dense)              (None, 4)                 1204      
=================================================================
Total params: 12,205,524
Trainable params: 12,205,524
Non-trainable params: 0

One side effect: this model has far more trainable parameters (about 12.2 million vs. 173 thousand), almost all of them in the Dense(500) layer that follows Flatten.
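If that parameter count is a concern, one option (a hedged sketch of mine, not something suggested in the answer above) is to replace Flatten with GlobalAveragePooling1D, which also collapses the sequence dimension but feeds only 32 averaged values into the dense layers:

from tensorflow import keras
from tensorflow.keras import models
from tensorflow.keras.layers import (Conv1D, MaxPooling1D, Dropout,
                                     GlobalAveragePooling1D, Dense)

def cnn_1d_gap(num_classes):
    # same convolutional stack as above, but with global average pooling
    # instead of Flatten before the classifier head
    model = models.Sequential()
    model.add(Conv1D(16, kernel_size=3, strides=1, activation="relu", input_shape=(3025, 1)))
    model.add(Conv1D(16, kernel_size=3, activation="relu"))
    model.add(MaxPooling1D(pool_size=2))
    model.add(Dropout(0.2))
    model.add(Conv1D(32, kernel_size=3, activation="relu"))
    model.add(Conv1D(32, kernel_size=3, activation="relu"))
    model.add(MaxPooling1D(pool_size=2))
    model.add(GlobalAveragePooling1D())  # (None, 753, 32) -> (None, 32)
    model.add(Dropout(0.2))
    model.add(Dense(500, activation="relu"))
    model.add(Dense(300, activation="relu"))
    model.add(Dense(num_classes, activation="softmax"))
    model.compile(
        optimizer=keras.optimizers.Adam(1e-3),
        loss="categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model

With this change the layer feeding Dense(500) has 32 values instead of 24,096, so that layer drops from roughly 12 million parameters to 16,500. Whether averaging away the sequence positions hurts accuracy depends on the data, so this is only an option to try, not a drop-in improvement.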


I figured out what I was missing:

  1. I was missing a to_categorical call on the y_* variables. I assumed df["label"] = pd.Categorical(df["label"]) already took care of that. So, before fitting the model (an alternative using a sparse loss is sketched at the end):
    from tensorflow.keras.utils import to_categorical

    y_train = to_categorical(y_train, 4)
    y_test = to_categorical(y_test, 4)
    
  2. I forgot to flatten the output after the last MaxPooling1D layer.

Now it works correctly.
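As a side note (my own addition, not part of the answers above): the to_categorical step can be skipped entirely by keeping the integer codes produced by cat.codes and switching the loss to sparse_categorical_crossentropy. A minimal sketch, reusing model, x_train, y_train, x_test, and y_test from the code above:

from tensorflow import keras

model.compile(
    optimizer=keras.optimizers.Adam(1e-3),
    loss="sparse_categorical_crossentropy",  # accepts integer class labels directly
    metrics=["accuracy"],
)
model.fit(x_train, y_train, epochs=100, batch_size=64,
          validation_data=(x_test, y_test))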