Week 9 & 10: Neural Networks and TensorFlow
Vanishing Gradient Problem - Gradients became so small that progress was extremely slow with the computing power available at the time.
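A minimal NumPy sketch of why gradients vanish: with sigmoid activations, backprop multiplies one sigmoid derivative (at most 0.25) per layer, so the gradient shrinks exponentially with depth. The depth of 20 here is an arbitrary illustrative choice, not from the course.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid; its maximum value is 0.25, at x = 0.
    s = sigmoid(x)
    return s * (1.0 - s)

# Backprop through a chain of sigmoid layers multiplies one such
# derivative per layer; even the best case (0.25 each) shrinks fast.
depth = 20
gradient = np.prod([sigmoid_grad(0.0)] * depth)
print(gradient)  # 0.25**20, roughly 9.1e-13
```

This is the core of the problem: long before GPUs, deep sigmoid networks trained so slowly that they were impractical.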
Hand-designed neural networks with many hidden layers are called deep neural networks.
GPUs, from around 2010, made it possible to learn from many millions of examples in a short time. With this brute-force computing power, researchers managed to get past the vanishing gradient problem.
A simple neural network consists of only one hidden layer.
Backpropagation (BackProp) is one of the oldest algorithms for training neural networks.
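To make backprop concrete, here is a minimal NumPy sketch of one-hidden-layer training: a forward pass, gradients computed layer by layer with the chain rule, and a gradient-descent update. The data (y = 2x), layer sizes, and learning rate are made-up illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn y = 2*x from a handful of points (made-up example).
X = rng.normal(size=(8, 1))
y = 2.0 * X

# One hidden layer (tanh activation) and a linear output layer.
W1 = 0.5 * rng.normal(size=(1, 4))
W2 = 0.5 * rng.normal(size=(4, 1))

lr = 0.1
losses = []
for _ in range(500):
    h = np.tanh(X @ W1)                  # forward: hidden activations
    pred = h @ W2                        # forward: prediction
    losses.append(float(np.mean((pred - y) ** 2)))
    d_pred = 2.0 * (pred - y) / len(X)   # backward: dL/dpred
    dW2 = h.T @ d_pred                   # dL/dW2
    d_h = d_pred @ W2.T                  # dL/dh (chain rule through W2)
    dW1 = X.T @ (d_h * (1.0 - h ** 2))   # tanh'(z) = 1 - tanh(z)**2
    W2 -= lr * dW2                       # gradient descent step
    W1 -= lr * dW1

print(losses[0], losses[-1])
```

The loss printed at the end should be far below the initial loss, showing that the chain-rule gradients point downhill.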
TensorFlow Linear Regression
A tensor in mathematics is an object that specifies a linear relationship. In terms of computation, we are talking about finite-dimensional objects.
In TensorFlow, the term tensor corresponds to a k-dimensional array of numbers.
1. A 0-dimensional tensor is a scalar
2. A 1-dimensional tensor is a vector
3. A 2-dimensional tensor is a matrix
4. A k-dimensional tensor is a k-dimensional numpy array
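The correspondence with NumPy arrays can be checked directly via the `ndim` attribute (the particular values below are arbitrary examples):

```python
import numpy as np

scalar = np.array(3.14)         # 0-dimensional tensor: a scalar
vector = np.array([1.0, 2.0])   # 1-dimensional tensor: a vector
matrix = np.eye(2)              # 2-dimensional tensor: a matrix
tensor3 = np.zeros((2, 3, 4))   # 3-dimensional tensor: a k-dim array

print(scalar.ndim, vector.ndim, matrix.ndim, tensor3.ndim)  # 0 1 2 3
```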
Each operation on tensors is a node in the TensorFlow graph. Nodes take tensors as input and return tensors as output.
TensorFlow Logistic Regression
Executing the Optimization in a session
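What the session's training loop does, stripped of the TensorFlow machinery, is repeatedly evaluate the loss gradients and apply an update. A plain NumPy sketch of that loop for linear regression (the data y = 3x + 1, learning rate, and iteration count are made-up illustrative choices):

```python
import numpy as np

# Made-up data from y = 3*x + 1 plus a little noise.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 1.0 + 0.1 * rng.normal(size=100)

w, b = 0.0, 0.0
lr = 0.5
for _ in range(200):
    pred = w * x + b
    # Gradients of the mean squared error with respect to w and b.
    grad_w = np.mean(2.0 * (pred - y) * x)
    grad_b = np.mean(2.0 * (pred - y))
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # close to the true slope 3 and intercept 1
```

In TensorFlow the same loop runs the optimizer's training op inside a session; this sketch only shows the arithmetic that the op performs.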
Using TensorBoard to view Graph Structure
TensorFlow Base API
TensorFlow Estimator API
Convolutional Neural Networks
Convolutional neural networks have a very specific architecture designed for image/object recognition in 2D images.
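The core operation of such an architecture is sliding a small filter over the image. A minimal NumPy sketch of a single 2D convolution (technically cross-correlation, as deep learning libraries implement it); the tiny image and vertical-edge filter are made-up examples:

```python
import numpy as np

def conv2d(image, kernel):
    # "Valid" 2D cross-correlation: slide the kernel over the image
    # and take the elementwise product-sum at each position.
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A tiny image with a vertical edge, and a filter that detects it.
image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=float)
kernel = np.array([[1, -1],
                   [1, -1]], dtype=float)
result = conv2d(image, kernel)
print(result)  # nonzero only in the column where the edge sits
```

A CNN layer learns many such kernels and applies them at every position, which is why it suits 2D image recognition.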