Image Recognition Model Using Keras - shefaas/PythonAI-BitDegree GitHub Wiki
General
- Model Type: Keras Sequential model, i.e. a linear stack of layers.
- Layers: 6 layers:
  - 2D Convolution Layer
  - 2D Max Pooling Layer
  - Flatten Layer
  - Dense Layer
  - Dropout Layer
  - Dense Layer
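The stack above can be sketched by tracking how each layer transforms the tensor shape. This pure-Python walkthrough assumes the 32×32 RGB input used below and a 10-class output (the class count is an assumption, e.g. CIFAR-10, and is not stated above):

```python
# Shape walkthrough for the 6-layer stack above.
# The 10-class output is an assumed example, not stated in the text.
input_shape = (32, 32, 3)                          # 32x32 RGB image

# Conv2D(32, (3, 3), padding='same'): 'same' keeps the spatial size, 32 filters out
conv_shape = (input_shape[0], input_shape[1], 32)  # (32, 32, 32)

# MaxPooling2D(pool_size=(2, 2)): halves each spatial dimension
pool_shape = (conv_shape[0] // 2, conv_shape[1] // 2, conv_shape[2])  # (16, 16, 32)

# Flatten: collapse the feature maps to a single dimension
flat_units = pool_shape[0] * pool_shape[1] * pool_shape[2]  # 16*16*32 = 8192

dense_units = 512      # Dense(512)
# Dropout(0.5) leaves the shape unchanged; it only zeroes activations during training
output_units = 10      # Dense(10, activation='softmax') -- assumed class count

print(conv_shape, pool_shape, flat_units, output_units)
```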
2D Convolution Layer (Conv2D)
From Docs:
```python
keras.layers.Conv2D(filters, kernel_size, strides=(1, 1), padding='valid', data_format=None, dilation_rate=(1, 1), activation=None, use_bias=True, kernel_initializer='glorot_uniform', bias_initializer='zeros', kernel_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, bias_constraint=None)
```
Working Example:
```python
from keras.constraints import maxnorm

Conv2D(32, (3, 3), input_shape=(32, 32, 3), activation='relu', padding='same', kernel_constraint=maxnorm(3))
```
- Creates a convolution kernel that is convolved with the input to produce a tensor of outputs.
- filters (32): the number of output filters in the convolution.
- kernel_size (3, 3): 3*3 convolution window.
- input_shape (32, 32, 3): a 32×32 RGB image; the 3 is the number of colour channels (R, G, B). This argument only needs to be specified for the first layer of a Sequential model.
- activation (relu): the Rectified Linear Unit, the most commonly used activation function for hidden layers.
- padding (same): padding strategy for the convolution window. 'same' zero-pads the input so the output has the same spatial dimensions as the input; 'valid' (the default) applies no padding, so the output shrinks.
- kernel_constraint (maxnorm(3)): constraint function applied to the kernel weights. MaxNorm constrains the norm of the weights to be at most the given value, here 3.
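What the convolution itself computes can be sketched in NumPy. The helper below is illustrative only (a single channel, one kernel, 'same' zero padding); like Keras, it computes a cross-correlation, i.e. the kernel is not flipped:

```python
import numpy as np

def conv2d_same(image, kernel):
    """Single-channel 2D cross-correlation with 'same' zero padding (illustrative sketch)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)))   # zero-pad so output size == input size
    out = np.zeros(image.shape, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            # slide the window and take the weighted sum under the kernel
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.zeros((3, 3))
kernel[1, 1] = 1.0          # identity kernel: the output reproduces the input
result = conv2d_same(image, kernel)
print(result)
```

With `padding='same'`, the 4×4 input yields a 4×4 output; with `'valid'` it would shrink to 2×2.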
2D Max Pooling Layer (MaxPooling2D)
From Docs:
```python
keras.layers.MaxPooling2D(pool_size=(2, 2), strides=None, padding='valid', data_format=None)
```
Working Example:
```python
MaxPooling2D(pool_size=(2, 2))
```
- Max pooling operation for spatial data.
- pool_size (2, 2): takes the maximum over each 2×2 window, halving both spatial dimensions.
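The pooling operation can be sketched in NumPy for a single channel; `max_pool_2x2` is an illustrative helper, not a Keras API:

```python
import numpy as np

def max_pool_2x2(x):
    """2x2 max pooling with stride 2 on a single channel (illustrative sketch)."""
    h, w = x.shape
    # group the array into non-overlapping 2x2 blocks, then take each block's max
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

x = np.array([[1, 2, 5, 6],
              [3, 4, 7, 8],
              [9, 1, 2, 3],
              [4, 5, 6, 7]], dtype=float)
pooled = max_pool_2x2(x)
print(pooled)   # each 2x2 block reduced to its maximum -> a 2x2 result
```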
Dense Layer (Dense)
From Docs:
```python
keras.layers.Dense(units, activation=None, use_bias=True, kernel_initializer='glorot_uniform', bias_initializer='zeros', kernel_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, bias_constraint=None)
```
Working Example:
```python
Dense(512, activation='relu', kernel_constraint=maxnorm(3))
```
- Regular densely-connected NN layer: every output unit is connected to every input unit.
- units (512): dimensionality of the output space.
- activation (relu): used for this hidden Dense layer; the final Dense layer of a classifier typically uses softmax instead, which normalizes the outputs into class probabilities.
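A Dense layer is just a matrix multiplication plus a bias, followed by the activation. A NumPy sketch with toy sizes (4 inputs, 3 units, not the 512 above; `dense` is an illustrative helper):

```python
import numpy as np

def dense(x, weights, bias, activation):
    """Fully-connected layer: every output unit sees every input (illustrative sketch)."""
    return activation(x @ weights + bias)

def relu(z):
    return np.maximum(z, 0.0)

def softmax(z):
    e = np.exp(z - z.max())          # shift by the max for numerical stability
    return e / e.sum()

rng = np.random.default_rng(0)
x = rng.normal(size=4)               # 4 input features (toy size)
w = rng.normal(size=(4, 3))          # weights for 3 output units
b = np.zeros(3)

hidden = dense(x, w, b, relu)        # like a hidden Dense layer with 'relu'
probs = dense(x, w, b, softmax)      # like the final softmax Dense layer
print(hidden, probs, probs.sum())    # softmax outputs sum to 1
```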
Flatten Layer (Flatten)
- Flattens the input: transforms a multi-dimensional tensor into a one-dimensional vector, e.g. so that convolutional feature maps can be fed into a Dense layer.
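In NumPy terms, flattening is a reshape (toy sizes for illustration):

```python
import numpy as np

feature_maps = np.arange(24).reshape(2, 3, 4)  # e.g. pooled conv output (toy sizes)
flat = feature_maps.reshape(-1)                # Flatten: one long vector
print(flat.shape)                              # (24,)
```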
Dropout Layer (Dropout)
From Docs:
```python
keras.layers.Dropout(rate, noise_shape=None, seed=None)
```
Working Example:
```python
Dropout(0.5)
```
- Randomly sets a fraction of the input units to 0 at each update during training, which helps prevent overfitting.
- rate (0.5): the fraction of input units to drop.
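The mechanism can be sketched in NumPy. This is "inverted" dropout, the variant Keras applies, which also rescales the surviving units so the expected activation is unchanged; `dropout` here is an illustrative helper:

```python
import numpy as np

def dropout(x, rate, rng):
    """Inverted dropout (illustrative sketch): zero a fraction of units at
    training time and scale the survivors by 1 / (1 - rate)."""
    keep = rng.random(x.shape) >= rate         # keep each unit with probability 1 - rate
    return np.where(keep, x / (1.0 - rate), 0.0)

rng = np.random.default_rng(42)
x = np.ones(10)
y = dropout(x, 0.5, rng)
print(y)   # roughly half the entries are 0, survivors scaled to 2.0
```

At inference time Keras disables dropout entirely, so no rescaling is needed there.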