Dependent hyperparameters with Keras Tuner

Problem description

My goal is to tune over possible network architectures that meet the following conditions:

  1. Layer 1 can have any number of hidden units from this list: [32, 64, 128, 256, 512]

The number of hidden units to explore for each remaining layer should then always depend on the particular choice made in the layer above it; specifically:

  1. Layer 2 can have the same number of units as layer 1, or half as many.
  2. Layer 3 can have the same number of units as layer 2, or half as many.
  3. Layer 4 can have the same number of units as layer 3, or half as many.

The way I'm currently implementing it, the hp.Choice options for layers 2, 3, and 4 are never updated after they are first established.

For example, suppose that on the tuner's first pass num_layers = 4, meaning all four layers will be created. If layer 1 then picks 256 hidden units, the options become:

Layer 2 --> [128, 256]

Layer 3 --> [64, 128]

Layer 4 --> [32, 64]

Layers 2, 3, and 4 stick with these options on every subsequent iteration, instead of updating to accommodate layer 1's future choices.

This means that on later iterations, when the number of hidden units in layer 1 changes, the options for layers 2, 3, and 4 no longer satisfy the intended goal: exploring a space in which each subsequent layer may contain either the same number of hidden units as the previous layer, or half as many.
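For reference, the constraint itself is pure arithmetic: given layer 1's choice, every later layer is either the same size or half the size of the layer above it. A minimal sketch of that derivation (`layer_units` is a hypothetical helper, with the 8-unit floor from the layer-4 stipulation in the code below applied to every layer):

```python
def layer_units(layer1_units, halve_flags, min_units=8):
    """Derive each layer's hidden-unit count from layer 1's choice plus
    per-layer 'halve' flags (1 = half the previous layer, 0 = same size).

    min_units mirrors the stipulation that units never drop below 8.
    """
    units = [layer1_units]
    for halve in halve_flags:
        units.append(max(units[-1] // 2, min_units) if halve else units[-1])
    return units

# Layer 1 = 256; layer 2 halves, layer 3 keeps, layer 4 halves.
print(layer_units(256, [1, 0, 1]))  # -> [256, 128, 128, 64]
```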

import tensorflow as tf
from tensorflow.keras import layers


def build_and_tune_model(hp, train_ds, normalize_features, ohe_features,
                         max_tokens, passthrough_features):

    all_inputs, encoded_features = get_all_preprocessing_layers(
        train_ds,
        normalize_features=normalize_features,
        ohe_features=ohe_features,
        max_tokens=max_tokens,
        passthrough=passthrough_features)

    # Possible values for the number of hidden units in layer 1.
    # Defined here because there will always be at least one layer.
    layer_1_hidden_units = hp.Choice('layer1_hidden_units',
                                     values=[32, 64, 128, 256, 512])

    # Possible number of layers to include.
    num_layers = hp.Choice('num_layers', values=[1, 2, 3, 4])

    print("================= starting new round =====================")
    print(f"Layer 1 hidden units = {hp.get('layer1_hidden_units')}")
    print(f"Num layers is {hp.get('num_layers')}")

    all_features = layers.concatenate(encoded_features)

    x = layers.Dense(layer_1_hidden_units, activation="relu")(all_features)

    if hp.get('num_layers') >= 2:

        # Layer 2 is active whenever num_layers is 2, 3, or 4
        # (the original [2, 4] omitted 3 by mistake).
        with hp.conditional_scope("num_layers", [2, 3, 4]):

            # Layer 2 hidden units can be half of layer 1's, or the same.
            layer_2_hidden_units = hp.Choice(
                'layer2_hidden_units',
                values=[int(hp.get('layer1_hidden_units') / 2),
                        hp.get('layer1_hidden_units')])

            print(f"layer_2_hidden_units = {hp.get('layer2_hidden_units')}")

            x = layers.Dense(layer_2_hidden_units, activation="relu")(x)

    if hp.get('num_layers') >= 3:

        with hp.conditional_scope("num_layers", [3, 4]):

            # Layer 3 hidden units can be half of layer 2's, or the same.
            layer_3_hidden_units = hp.Choice(
                'layer3_hidden_units',
                values=[int(hp.get('layer2_hidden_units') / 2),
                        hp.get('layer2_hidden_units')])

            print(f"layer_3_hidden_units = {hp.get('layer3_hidden_units')}")

            x = layers.Dense(layer_3_hidden_units, activation="relu")(x)

    if hp.get('num_layers') >= 4:

        with hp.conditional_scope("num_layers", [4]):

            # Layer 4 hidden units can be half of layer 3's, or the same.
            # Extra stipulation: layer 4 can never have fewer than 8 units.
            layer_4_hidden_units = hp.Choice(
                'layer4_hidden_units',
                values=[max(int(hp.get('layer3_hidden_units') / 2), 8),
                        hp.get('layer3_hidden_units')])

            print(f"layer_4_hidden_units = {hp.get('layer4_hidden_units')}")

            x = layers.Dense(layer_4_hidden_units, activation="relu")(x)

    output = layers.Dense(1, activation='sigmoid')(x)

    model = tf.keras.Model(all_inputs, output)

    model.compile(optimizer=tf.keras.optimizers.Adam(),
                  metrics=['accuracy'],
                  loss='binary_crossentropy')

    print(">>>>>>>>>>>>>>>>>>>>>>>>>>>> End of round <<<<<<<<<<<<<<<<<<<<<<<<<")

    return model

Does anyone know the right way to tell Keras Tuner to explore all possible options for each layer's hidden units, where the explored region satisfies the criterion that every layer after the first is allowed to have either the same number of hidden units as the previous layer or half as many, and the first layer can have any number of hidden units from the list [32, 64, 128, 256, 512]?
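One workaround that sidesteps the freezing behavior described above is to keep every hp.Choice value list fixed across trials (the tuner registers a hyperparameter's values once and reuses them) and move the dependency into derived Python values: tune a per-layer "halve or keep" flag instead of the unit counts themselves, then compute the actual counts from layer 1's choice. A sketch of that idea, using a minimal stand-in for the tuner's HyperParameters object so it runs standalone (the `FakeHP` stub and all names here are illustrative, not keras_tuner's API):

```python
class FakeHP:
    """Tiny stand-in for a HyperParameters object: Choice() records the
    search space and returns a preselected value; get() looks it up."""

    def __init__(self, values):
        self._values = values   # name -> chosen value for this "trial"
        self.space = {}         # name -> registered options

    def Choice(self, name, values):
        self.space[name] = values
        return self._values[name]

    def get(self, name):
        return self._values[name]


def units_for_layers(hp, min_units=8):
    """Every Choice has a fixed value list, so the space never freezes;
    the 'same or half of the previous layer' rule lives in the derived
    unit counts, which follow layer 1's choice on every trial."""
    units = [hp.Choice('layer1_hidden_units', [32, 64, 128, 256, 512])]
    num_layers = hp.Choice('num_layers', [1, 2, 3, 4])
    for i in range(2, num_layers + 1):
        halve = hp.Choice(f'layer{i}_halve', [0, 1])  # 0 = same, 1 = half
        units.append(max(units[-1] // 2, min_units) if halve else units[-1])
    return units


hp = FakeHP({'layer1_hidden_units': 256, 'num_layers': 3,
             'layer2_halve': 1, 'layer3_halve': 0})
print(units_for_layers(hp))  # -> [256, 128, 128]
```

In a real build function the returned counts would feed the Dense layers directly; with a genuine keras_tuner HyperParameters object, the `layer{i}_halve` choices could also be wrapped in conditional scopes on num_layers, as in the question's code.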
