I'm applying the SequentialFeatureSelector (SFS) from mlxtend to a Keras neural network wrapped with the scikit-learn API via KerasClassifier. Currently I'm stuck on an input-dimension mismatch.
Below is the mlxtend portion of the code, where I tried to ensure the input dimension matches the requirement of the first layer of the network.
# Wrap Keras nn and generate SFS object
skwrapped_model = KerasClassifier(build_fn=make_model,
                                  train_full_input=train_predictor,
                                  epochs=EPOCHS,
                                  batch_size=BATCH_SIZE,
                                  validation_split=1-TRAIN_TEST_SPLIT,
                                  verbose=0)
sffs = SFS(skwrapped_model,
           k_features=(1, train_predictor.shape[1]),
           floating=True,
           clone_estimator=False,
           cv=0,
           n_jobs=1,
           scoring='f1_macro')

# Apply SFS to identify best feature subset
sffs = sffs.fit(train_predictor, train_response)
Despite my input having exactly 45 columns, I'm still getting the error:
ValueError: Input 0 of layer sequential is incompatible with the layer: expected axis -1 of input shape to have value 45 but received input with shape [None, 1]
The detailed error log suggests my input somehow got converted into a 2-D array of 0s during processing.
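For context, my understanding is that SFS evaluates candidate feature subsets by fitting the estimator on column slices of the data, so the very first candidates are single columns. Here is a minimal numpy-only sketch of the shape I believe is being passed in (the shapes below are illustrative, not from my actual data):

```python
import numpy as np

# Illustrative stand-in for my training matrix: 45 columns
X = np.zeros((10, 45))

# SFS starts from single-feature subsets, i.e. column slices like this:
candidate = X[:, [0]]

# A Dense layer built with input_shape=(45,) would reject this input,
# since its last axis has size 1 rather than 45.
print(candidate.shape)  # (10, 1)
```

This matches the [None, 1] in the error message, which is why I suspect the fixed input shape is the problem rather than the data itself.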
My neural network model is defined as:
def make_model(train_full_input, output_bias=None):
    if output_bias is not None:
        # Incorporate initial guess to speed up convergence
        output_bias = tf.keras.initializers.Constant(output_bias)
    # 1 ReLU layer + 1 Dropout layer + 1 softmax layer for 3-cat classification
    model = keras.Sequential([
        keras.layers.Dense(16,
                           activation='relu',
                           input_shape=(train_full_input.shape[-1],)),
        keras.layers.Dropout(0.5),
        keras.layers.Dense(3,
                           activation='softmax',
                           bias_initializer=output_bias)
    ])
    model.compile(optimizer=keras.optimizers.Adam(lr=1e-3),
                  loss=macro_soft_f1)
    return model
EPOCHS = 100
# Using a large batch size ensures that each batch is likely to contain a few
# samples from the minority classes of the imbalanced input.
BATCH_SIZE = 2048
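As a rough sanity check on that comment (the minority frequency below is a made-up number, not from my dataset), the chance that a batch contains at least one sample of a rare class grows quickly with batch size:

```python
# Probability that a batch of size B contains at least one sample of a
# class with (assumed) frequency p, treating draws as independent.
p = 0.001   # hypothetical minority-class frequency
B = 2048    # my batch size
prob_at_least_one = 1 - (1 - p) ** B
print(round(prob_at_least_one, 3))  # ~0.871
```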
Any help is appreciated!
question from:
https://stackoverflow.com/questions/65950224/applying-mlxtend-to-kerasclassifier-leads-to-valueerror