Network structure inspired by simplified models of biological neurons (brain cells).
I am manually creating my dataset from a number of 384×286 black-and-white images. I load an image like this: …
Tags: python, tensorflow, neural-network, keras, convolution

Although both of the above methods score a prediction more highly the closer it is to the target, cross-entropy is still preferred. …
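A quick numeric illustration of the asker's point (a minimal sketch; the function names are mine, and this is the binary single-prediction case): a confidently wrong prediction incurs a bounded MSE penalty but an unbounded cross-entropy penalty, which is one common reason cross-entropy is preferred for classification.

```python
import math

def mse(y_true, y_pred):
    # Squared error for a single probability prediction.
    return (y_true - y_pred) ** 2

def cross_entropy(y_true, y_pred):
    # Binary cross-entropy for a single probability prediction.
    return -(y_true * math.log(y_pred) + (1 - y_true) * math.log(1 - y_pred))

# A confidently wrong prediction (true label 1, predicted 0.01):
# MSE caps its penalty near 1, while cross-entropy grows without bound
# as the predicted probability approaches 0.
print(mse(1, 0.01))            # ~0.98
print(cross_entropy(1, 0.01))  # ~4.61
```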
Tags: machine-learning, neural-network, backpropagation, mean-square-error, cross-entropy

I know that, in the 1D case, the convolution between two vectors, a and b, can be computed as conv(…
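One concrete way to connect the 1D case to matrix multiplication: convolution with a fixed vector a is a linear map, so it can be written as multiplication by a Toeplitz matrix built from a. A minimal numpy sketch (`conv_matrix` is a name introduced here for illustration), checked against `np.convolve`:

```python
import numpy as np

def conv_matrix(a, m):
    """Build the (len(a)+m-1) x m Toeplitz matrix whose product with a
    length-m vector equals the full 1D convolution with a."""
    n = len(a)
    T = np.zeros((n + m - 1, m))
    for j in range(m):
        # Column j is a copy of a, shifted down by j rows.
        T[j:j + n, j] = a
    return T

a = np.array([1.0, 2.0, 3.0])
b = np.array([0.0, 1.0, 0.5])
T = conv_matrix(a, len(b))
print(T @ b)              # convolution via matrix multiplication
print(np.convolve(a, b))  # reference result, identical
```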
Tags: neural-network, deep-learning, conv-neural-network, matrix-multiplication, convolution

I'm currently trying to get an ANN to play a video game and I was hoping to get some …
Tags: machine-learning, computer-vision, neural-network, video-processing, reinforcement-learning

The introductory documentation I am reading (TOC here) uses the term "batch" (for instance here) without having defined it.
Tags: tensorflow, machine-learning, neural-network, deep-learning, tensor

I am trying to grasp what the TimeDistributed wrapper does in Keras. I get that TimeDistributed "applies a layer to every …
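The quoted behaviour can be sketched without Keras at all: a minimal numpy illustration of what wrapping a dense layer in TimeDistributed amounts to, with the batch size, timestep count, and feature sizes chosen here purely as assumptions.

```python
import numpy as np

# The same dense weights are applied independently at every timestep,
# i.e. weights are shared across time.
rng = np.random.default_rng(0)
W = rng.normal(size=(16, 8))       # shared dense weights (16 -> 8 units)
bias = np.zeros(8)

x = rng.normal(size=(2, 10, 16))   # (batch, timesteps, features)

# Applying W at each timestep is a single batched matmul over the time axis.
y = x @ W + bias
print(y.shape)  # (2, 10, 8)

# Equivalent explicit loop: each timestep sees the identical layer.
y_loop = np.stack([x[:, t] @ W + bias for t in range(10)], axis=1)
assert np.allclose(y, y_loop)
```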
Tags: python, machine-learning, keras, neural-network, deep-learning

I'm trying to use deep learning to predict income from 15 self-reported attributes from a dating site. We're getting rather …
Tags: tensorflow, machine-learning, neural-network, keras, classification

I understand that Batch Normalisation helps in faster training by turning the activation towards unit Gaussian distribution and thus tackling …
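The "activation towards unit Gaussian" step the asker mentions can be sketched in a few lines of numpy. This is training-mode statistics only (no running averages), and the gamma/beta values below are placeholders for what would normally be learned parameters.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Per-feature zero mean / unit variance over the batch, followed by
    # a learned scale (gamma) and shift (beta).
    mean = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                      # per-feature batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)  # ~ unit Gaussian per feature
    return gamma * x_hat + beta

# Activations with mean 5 and std 3 come out with mean ~0 and std ~1.
x = np.random.default_rng(1).normal(5.0, 3.0, size=(64, 4))
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0).round(3), y.std(axis=0).round(3))
```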
Tags: machine-learning, neural-network, computer-vision, conv-neural-network, batch-normalization

Looking at an example 'solver.prototxt' posted on the BVLC/caffe git, there is a training meta-parameter weight_decay: 0.04. What …
Tags: machine-learning, neural-network, deep-learning, caffe, gradient-descent

In the TensorFlow MNIST tutorial the mnist.train.next_batch(100) function comes in very handy. I am now trying to implement …
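For readers with the same question, a minimal numpy sketch of a next_batch-style helper: shuffle once per epoch, then hand out consecutive slices. The class name and the reshuffle-on-epoch-end policy are assumptions for illustration, not the tutorial's actual code.

```python
import numpy as np

class DataSet:
    def __init__(self, images, labels):
        self.images, self.labels = images, labels
        self._index = 0
        self._order = np.random.permutation(len(images))

    def next_batch(self, batch_size):
        if self._index + batch_size > len(self.images):
            # Epoch finished: reshuffle and start over.
            self._order = np.random.permutation(len(self.images))
            self._index = 0
        idx = self._order[self._index:self._index + batch_size]
        self._index += batch_size
        return self.images[idx], self.labels[idx]

data = DataSet(np.arange(10).reshape(10, 1), np.arange(10))
xb, yb = data.next_batch(4)
print(xb.shape, yb.shape)  # (4, 1) (4,)
```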
Tags: python, numpy, neural-network, tensorflow, classification