The Keras autotuner (the KerasTuner library) allows us to automate the hyperparameter configuration of our neural network.
When we want to iterate over multiple configurations for our model (testing different layers, numbers of neurons, epochs, learning rates, and so on), we do not have to do it manually, model by model; this tool lets us define and run different configurations and compare their performance.

Implementing the autotuner
The Keras autotuner is not loaded by default in Google Colab, so we must install it from the command line.

```python
!pip install -q -U keras-tuner
```
To use it we will import it as keras_tuner (recent versions of the library use this module name; older releases imported it as kerastuner).

```python
import keras_tuner as kt
from tensorflow import keras
```
For this occasion we will create a new model-builder function. It receives as a parameter a hyperparameter object (hp) that determines how the different hyperparameters will vary.

We will define a general architecture in which the convolution layer, max pooling, and flattening are fixed. Then we define the builder's first variable: the number of neurons in the next hidden layer, which starts at 32 and increases up to 512 in steps of 32.

The number of neurons in that layer will be the tunable hyperparameter; the rest of the network remains fixed.

Finally, we will define variations in the learning rate, starting the model with 3 possible learning rates: 0.01, 0.001, and 0.0001.

When compiling the model we will define Adam as the optimizer; however, we will instantiate the class directly and pass it the learning-rate hyperparameter. All other parameters remain the same.
```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import regularizers

def constructor_models(hp):
    model = tf.keras.models.Sequential()

    # Fixed part of the architecture: convolution, max pooling and flattening
    model.add(tf.keras.layers.Conv2D(75, (3, 3), activation="relu", input_shape=(28, 28, 1)))
    model.add(tf.keras.layers.MaxPooling2D((2, 2)))
    model.add(tf.keras.layers.Flatten())

    # Tunable number of neurons: from 32 to 512, in steps of 32
    hp_units = hp.Int("units", min_value=32, max_value=512, step=32)
    model.add(tf.keras.layers.Dense(units=hp_units, activation="relu",
                                    kernel_regularizer=regularizers.l2(1e-5)))
    model.add(tf.keras.layers.Dropout(0.2))
    model.add(tf.keras.layers.Dense(128, activation="relu",
                                    kernel_regularizer=regularizers.l2(1e-5)))
    model.add(tf.keras.layers.Dropout(0.2))
    # `classes` is the list of target labels defined earlier in the notebook
    model.add(tf.keras.layers.Dense(len(classes), activation="softmax"))

    # Tunable learning rate: 3 possible values
    hp_learning_rate = hp.Choice("learning_rate", values=[1e-2, 1e-3, 1e-4])

    model.compile(optimizer=keras.optimizers.Adam(learning_rate=hp_learning_rate),
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])

    return model
```
This function will be the raw material of the tuner, which will test the possible combinations to find the best-performing model.
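As a minimal sketch of how the builder might then be handed to the tuner: the choice of Hyperband, the objective, and the data variables (train_images, train_labels) are assumptions for illustration, not names from the original lesson.

```python
# A minimal sketch, assuming training data (train_images, train_labels)
# was loaded earlier; those names are illustrative.
tuner = kt.Hyperband(
    constructor_models,        # the builder function defined above
    objective="val_accuracy",  # metric the tuner tries to maximize
    max_epochs=10,
    directory="tuning",        # where trial checkpoints are stored
    project_name="autotuner_demo",
)

# Run the search over the hyperparameter combinations.
tuner.search(train_images, train_labels, epochs=10, validation_split=0.2)

# Retrieve the best hyperparameters and build the final model with them.
best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]
model = tuner.hypermodel.build(best_hp)
```

Hyperband is one of several search strategies KerasTuner offers (RandomSearch and BayesianOptimization are alternatives); it discards weak configurations early, which keeps the search cheaper than training every combination to completion.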
Contribution created by Sebastián Franco Gómez.