How do I tune the hyperparameters of a multilayer perceptron (MLP) neural network?
```python
check_parameters = {
    'hidden_layer_sizes': [(50, 50), (100, 50)],
    'activation': ['tanh', 'relu'],
    'solver': ['sgd', 'adam'],
    'alpha': [0.0001, 0.05],
    'learning_rate': ['constant', 'adaptive'],
}
```
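One common way to search a parameter grid like this is scikit-learn's `GridSearchCV` wrapped around an `MLPClassifier`. Below is a minimal sketch, assuming scikit-learn is available; the iris dataset, `cv=2`, and the small `max_iter` are illustrative choices to keep the run short, not recommendations.

```python
# Sketch: exhaustive grid search over MLP hyperparameters with scikit-learn.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

check_parameters = {
    'hidden_layer_sizes': [(50, 50), (100, 50)],
    'activation': ['tanh', 'relu'],
    'solver': ['sgd', 'adam'],
    'alpha': [0.0001, 0.05],
    'learning_rate': ['constant', 'adaptive'],
}

X, y = load_iris(return_X_y=True)
X = StandardScaler().fit_transform(X)  # MLPs are sensitive to feature scale

# max_iter is kept small here only so the example runs quickly;
# raise it (and cv) for real tuning.
mlp = MLPClassifier(max_iter=100, random_state=0)
search = GridSearchCV(mlp, check_parameters, cv=2, n_jobs=-1)
search.fit(X, y)

print(search.best_params_)   # best combination found in the grid
print(search.best_score_)    # its mean cross-validated accuracy
```

`GridSearchCV` fits one model per parameter combination per fold (here 32 combinations x 2 folds), so grids grow expensive quickly; `RandomizedSearchCV` is the usual alternative when the grid is large.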