by Synced 2017-09-25 AI Research A Brief Overview of Attention Mechanism Attention is simply a vector, often the output of a dense layer passed through a softmax function.
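A minimal sketch of that idea: attention weights are a vector obtained by applying softmax to the scores of a dense (linear) layer. The function and variable names below are illustrative, not taken from the article.

```python
import numpy as np

def softmax(x):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_weights(query, keys, W):
    # Dense-layer scores: one scalar score per key, then softmax
    # turns the scores into a probability vector summing to 1.
    scores = keys @ W @ query
    return softmax(scores)

rng = np.random.default_rng(0)
keys = rng.standard_normal((4, 3))   # 4 items, each of dimension 3
query = rng.standard_normal(3)
W = rng.standard_normal((3, 3))      # dense-layer weight matrix
w = attention_weights(query, keys, W)
print(w)  # a distribution over the 4 items
```

The softmax step is what makes the vector interpretable as attention: the weights are non-negative and sum to one, so they act as a soft selection over the items.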
by Synced 2017-08-25 Research Memory, Attention, Sequences In the article “Memory, attention, sequences”, the author predicts that future work on neural networks will emphasize understanding complex spatio-temporal data from the real world, which is highly contextual and noisy.
by Synced 2017-02-08 Research Talk Review Geoffrey Hinton: Using Fast Weights to Store Temporary Memories In this Machine Learning Advances and Applications Seminar talk at the University of Toronto, Geoffrey Hinton addresses how to use fast weights to effectively store temporary memories.