Oddly Satisfying Deep Learning
Introduction
1. Preliminaries
1.1. Data Preprocessing
1.2. Performance Metrics for ML and DL models
2. Multilayer Perceptrons
2.1. Activation Functions
2.2. Perceptron
2.3. Terminology Part 1
2.4. Cost functions
2.5. Forward propagation
2.6. Backpropagation
2.7. Terminology Part 2
2.8. Gradient Descent
2.9. Regularization
2.10. Dropout regularization
2.11. Batch Normalization
2.12. Numerical example: Forward and Backward pass
2.13. Shortcut to calculate forward pass and backpropagation across layers
2.14. MLP model from scratch in Python
2.15. 4-step process to build an MLP model using PyTorch
2.16. MLP model using TensorFlow - Keras
3. Convolutional Neural Networks
3.1. Convolutional Neural Networks over MLPs
3.2. Basic Architecture of CNN
3.2.1. Convolutional layers
3.2.2. Forward Propagation Convolution layer (Vectorized)
3.2.3. Backward Propagation Convolution layer (Vectorized)
3.2.4. Pooling layers
3.3. Convolutional Neural Networks from scratch in Python
3.4. 4-step process to build a CNN model using PyTorch
3.5. CNN model using TensorFlow - Keras
3.6. State-of-the-art CNN models
4. Word Embeddings
4.1. Traditional Word Embeddings
4.2. Static Word Embeddings
4.2.1. Word2Vec
4.2.2. GloVe
4.2.3. FastText
4.3. Contextual Word Embeddings
4.3.1. Embeddings from Language Models (ELMo)