Choosing a good learning rate for gradient descent helps minimize the loss of a neural network. However, there are additional methods that can make this process smoother, faster, and more accurate. The first technique is Stochastic Gradient Descent with Restarts (SGDR), a variant of learning rate annealing, which gradually decreases the learning rate...
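As a rough illustration of the annealing schedule SGDR uses, here is a minimal sketch of a cosine-annealed learning rate with warm restarts. The function name and parameters (`lr_max`, `lr_min`, `cycle_len`, `cycle_mult`) are illustrative choices, not part of any particular library:

```python
import math

def sgdr_lr(step, lr_max=0.1, lr_min=0.0, cycle_len=10, cycle_mult=2):
    """Cosine-annealed learning rate with warm restarts (SGDR sketch)."""
    t, T = step, cycle_len
    while t >= T:          # locate the step within the current cycle
        t -= T
        T *= cycle_mult    # each restart lengthens the next cycle
    # Anneal from lr_max down toward lr_min along a cosine curve
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * t / T))
```

Within each cycle the rate decays from `lr_max` toward `lr_min`; at the start of the next cycle it "restarts" back at `lr_max`, which can help the optimizer escape sharp minima.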
Gradient Descent and Learning Rate
In every neural network, many weights and biases connect the neurons in different layers. With the right weights and biases, the neural network can do its job well. When training the neural network, we try to find the weights and biases that give the best performance. This process...
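To make the training process concrete, here is a minimal sketch of gradient descent on a one-dimensional loss. The loss, its gradient, and all parameter values are illustrative assumptions, not taken from the post:

```python
# Gradient descent on the toy loss L(w) = (w - 3)^2,
# whose gradient is 2 * (w - 3) and whose minimum sits at w = 3.
def gradient_descent(w=0.0, lr=0.1, steps=100):
    for _ in range(steps):
        grad = 2 * (w - 3)   # dL/dw at the current weight
        w -= lr * grad       # update rule: step opposite the gradient
    return w
```

Repeatedly stepping against the gradient moves `w` toward the minimum; a real network does the same thing simultaneously for every weight and bias.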
What is a Neural Network?
What are Neural Networks? This is a dog. And this is a cat. When our eyes see these two pictures, our mind immediately tells us what kind of animal is being shown. Easy, right? But what if you had to teach a machine to distinguish cats from dogs? If we rely solely on the logic-based...