1. Rectified Linear Unit (ReLU) is a non-linear activation function; a max-pooling layer applies a max-pooling operation to its inputs.
2. Fully Connected (FC) layers follow the stack of convolutional layers, which varies in depth.
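To make these building blocks concrete, a minimal sketch follows. The framework (PyTorch), the layer widths, the 32x32 input size, and the 10-class output are illustrative assumptions, not details from the text; the sketch only shows how ReLU activations, max pooling, and FC layers are typically arranged after convolutional layers.

```python
import torch
import torch.nn as nn

# Illustrative stack (sizes assumed, not from the text): convolutions with
# ReLU activations and max pooling, followed by fully connected layers.
model = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1),   # convolutional layer
    nn.ReLU(),                                    # non-linear activation
    nn.MaxPool2d(kernel_size=2, stride=2),        # max pooling halves spatial size
    nn.Conv2d(64, 128, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(kernel_size=2, stride=2),
    nn.Flatten(),                                 # flatten feature maps for FC layers
    nn.Linear(128 * 8 * 8, 256),                  # fully connected layer
    nn.ReLU(),
    nn.Linear(256, 10),                           # FC output layer (10 classes assumed)
)

x = torch.randn(1, 3, 32, 32)                     # dummy 32x32 RGB input
print(model(x).shape)                             # torch.Size([1, 10])
```

The max-pooling layers reduce the spatial resolution (32 -> 16 -> 8 here), so the first FC layer's input size is the flattened 128 x 8 x 8 feature map.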