In deeper neural networks, particularly recurrent neural networks, we may run into two further problems when the model is trained with gradient descent and backpropagation. The gradient is simply a vector. Generally speaking, a vector is a matrix in ℝⁿˣ¹ (it has just one column).
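The two problems alluded to here are presumably the vanishing and exploding gradient problems, which are the classic failure modes of backpropagation through time in recurrent networks. As a minimal sketch (not the author's own example), repeatedly multiplying a gradient column vector by a Jacobian with singular values below 1 shrinks its norm toward zero, while singular values above 1 blow it up; the scaled identity matrices below are hypothetical stand-ins for the recurrent Jacobian:

```python
import numpy as np

n = 10        # hidden dimension (illustrative choice)
steps = 50    # number of time steps to backpropagate through

# The gradient as a column vector, i.e. a matrix in R^{n x 1}.
grad = np.ones((n, 1))

# Hypothetical recurrent Jacobians: one contracting, one expanding.
W_small = 0.5 * np.eye(n)
W_large = 1.5 * np.eye(n)

g_vanish = grad.copy()
g_explode = grad.copy()
for _ in range(steps):
    # Backpropagation through time multiplies by the (transposed)
    # Jacobian once per time step.
    g_vanish = W_small.T @ g_vanish
    g_explode = W_large.T @ g_explode

print(np.linalg.norm(g_vanish))   # shrinks toward 0 (vanishing gradient)
print(np.linalg.norm(g_explode))  # grows without bound (exploding gradient)
```

The same effect occurs with general (non-diagonal) Jacobians; what matters is whether their singular values sit below or above 1 across many time steps.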