Category: general

Vanishing And Exploding Gradient Problems

A look at the vanishing and exploding gradient problems, two common issues encountered when training deep neural networks with gradient-based learning methods and backpropagation.

Backpropagation In Convolutional Neural Networks

A closer look at the concept of weight sharing in convolutional neural networks (CNNs) and how it affects forward and backward propagation when computing gradients during training.

Formulating The ReLU

A critical review of the rectified linear unit (ReLU) activation function as an elementary building block of modern deep neural network architectures.