Deep Learning: Technical Introduction
by Thomas Epelbaum
Publisher: arXiv.org 2017
Number of pages: 106
Description:
This note presents, in a technical though hopefully pedagogical way, the three most common forms of neural network architectures: Feedforward, Convolutional and Recurrent. For each network, its fundamental building blocks are detailed. The forward pass and the update rules for the backpropagation algorithm are then derived in full.
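As a rough illustration of the material the note covers (a minimal sketch, not code taken from the note itself), the following assumes a one-hidden-layer feedforward network with sigmoid activations and a squared-error loss, and shows a forward pass followed by the backpropagation update rules; the note derives the same quantities for general depth and for convolutional and recurrent layers as well.

    # Minimal sketch (assumptions: one hidden layer, sigmoid activations,
    # squared-error loss, toy dimensions chosen only for illustration).
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    n_in, n_hidden, n_out = 4, 8, 2
    x = rng.standard_normal(n_in)            # one input example
    y = rng.random(n_out)                    # its target
    W1 = 0.1 * rng.standard_normal((n_hidden, n_in))
    b1 = np.zeros(n_hidden)
    W2 = 0.1 * rng.standard_normal((n_out, n_hidden))
    b2 = np.zeros(n_out)
    lr = 0.1                                 # learning rate

    # Forward pass: affine map followed by a sigmoid at each layer.
    h = sigmoid(W1 @ x + b1)
    y_hat = sigmoid(W2 @ h + b2)
    loss = 0.5 * np.sum((y_hat - y) ** 2)

    # Backward pass: the chain rule gives a per-layer error signal ("delta").
    delta2 = (y_hat - y) * y_hat * (1 - y_hat)   # dL/d(output pre-activation)
    delta1 = (W2.T @ delta2) * h * (1 - h)       # dL/d(hidden pre-activation)

    # Gradient-descent update rules.
    W2 -= lr * np.outer(delta2, h)
    b2 -= lr * delta2
    W1 -= lr * np.outer(delta1, x)
    b1 -= lr * delta1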
Download or read it online for free here:
Download link
(2.2MB, PDF)
Similar books

Neural Networks and Deep Learning by Michael Nielsen
Neural networks and deep learning currently provide the best solutions to many problems in image recognition, speech recognition, and natural language processing. This book will teach you the core concepts behind neural networks and deep learning.

Deep Learning by Yoshua Bengio, Ian Goodfellow, Aaron Courville - MIT Press
This book can be useful for university students learning about machine learning, and for practitioners of machine learning, artificial intelligence, data mining, and data science who aim to better understand and take advantage of deep learning.

Deep Learning in Neural Networks: An Overview by Juergen Schmidhuber - arXiv
In recent years, deep artificial neural networks (including recurrent ones) have won numerous contests in pattern recognition and machine learning. This historical survey compactly summarises relevant work, much of it from the previous millennium.

The Matrix Calculus You Need For Deep Learning by Terence Parr, Jeremy Howard - arXiv.org
This paper is an attempt to explain all the matrix calculus you need in order to understand the training of deep neural networks. We assume no knowledge beyond what you learned in calculus 1, and provide links to help you refresh the necessary math.
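As a small illustration of the kind of chain-rule identity this matrix-calculus paper builds toward (an assumed example, not quoted from the paper): for a single linear layer \hat{y} = W x with squared-error loss L = \tfrac{1}{2}\lVert \hat{y} - y \rVert^{2}, the chain rule gives

    \frac{\partial L}{\partial W} = (\hat{y} - y)\, x^{\top}

with the convention that the gradient has the same shape as W. This "error signal times transposed input" pattern is the same one that appears as the outer products in the backpropagation sketch earlier on this page.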