Deep Learning: Technical Introduction
by Thomas Epelbaum
Publisher: arXiv.org 2017
Number of pages: 106
Description:
This note presents, in a technical though hopefully pedagogical way, the three most common neural network architectures: Feedforward, Convolutional and Recurrent. For each architecture, the fundamental building blocks are detailed. The forward pass and the update rules of the backpropagation algorithm are then derived in full.
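To give a flavor of what the note covers, here is a minimal sketch (not taken from the book) of a forward pass through a tiny feedforward network with sigmoid activations; the weights, biases and input below are illustrative placeholders.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # Each output unit computes sigmoid(w . x + b)
    return [sigmoid(sum(w_i * x_i for w_i, x_i in zip(w, inputs)) + b)
            for w, b in zip(weights, biases)]

# Toy network: 2 inputs -> 2 hidden units -> 1 output (placeholder values)
W1 = [[0.5, -0.3], [0.8, 0.2]]
b1 = [0.1, -0.1]
W2 = [[1.0, -1.0]]
b2 = [0.0]

x = [1.0, 0.5]
hidden = layer(x, W1, b1)
output = layer(hidden, W2, b2)
print(output)
```

The backpropagation update rules derived in the note would then adjust `W1`, `b1`, `W2`, `b2` by propagating the output error backwards through these same layers.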
Download or read it online for free (2.2MB, PDF).
Similar books
The Matrix Calculus You Need For Deep Learning
by Terence Parr, Jeremy Howard - arXiv.org
This paper is an attempt to explain all the matrix calculus you need in order to understand the training of deep neural networks. We assume no knowledge beyond what you learned in calculus 1, and provide links to help you refresh the necessary math.
(5329 views)
![Book cover: Deep Learning](images/10150.jpg)
by Yoshua Bengio, Ian Goodfellow, Aaron Courville - MIT Press
This book can be useful for university students learning about machine learning, and for practitioners of machine learning, artificial intelligence, data mining and data science aiming to better understand and take advantage of deep learning.
(16886 views)
![Book cover: Deep Learning in Neural Networks: An Overview](images/10196.jpg)
by Juergen Schmidhuber - arXiv
In recent years, deep artificial neural networks (including recurrent ones) have won numerous contests in pattern recognition and machine learning. This historical survey compactly summarises relevant work, much of it from the previous millennium.
(10217 views)
![Book cover: Neural Networks and Deep Learning](images/10060.jpg)
by Michael Nielsen
Neural networks and deep learning currently provide the best solutions to many problems in image recognition, speech recognition, and natural language processing. This book will teach you the core concepts behind neural networks and deep learning.
(9928 views)