Videos by Sebastian Raschka:
Build an LLM from Scratch 7: Instruction Finetuning (1:46:04)
Build an LLM from Scratch 6: Finetuning for Classification (2:15:29)
Build an LLM from Scratch 5: Pretraining on Unlabeled Data (2:36:44)
Build an LLM from Scratch 4: Implementing a GPT model from Scratch To Generate Text (1:45:37)
Build an LLM from Scratch 3: Coding attention mechanisms (2:15:41)
Build an LLM from Scratch 2: Working with text data (1:28:01)
Build an LLM from Scratch 1: Set up your code environment (21:02)
Reinforcement Learning with Human Feedback (RLHF) in 4 minutes (4:06)
LLMs: A Journey Through Time and Architecture (19:44)
Building LLMs from the Ground Up: A 3-hour Coding Workshop (2:45:10)
Understanding PyTorch Buffers (13:33)
Developing an LLM: Building, Training, Finetuning (58:46)
Managing Sources of Randomness When Training Deep Neural Networks (23:11)
Insights from Finetuning LLMs with Low-Rank Adaptation (13:49)
Finetuning Open-Source LLMs (20:05)
Scaling PyTorch Model Training With Minimal Code Changes (15:25)
L13.5 What's The Difference Between Cross-Correlation And Convolution? (10:38)
Conditional Ordinal Regression for Neural Networks (CORN) With Examples in PyTorch (28:33)
The Three Elements of PyTorch (56:58)
Ratings and Rankings -- Using Deep Learning When Class Labels Have A Natural Order (14:59)
13.4.5 Sequential Feature Selection -- Code Examples (L13: Feature Selection) (23:36)
13.4.4 Sequential Feature Selection (L13: Feature Selection) (30:00)
13.4.3 Feature Permutation Importance Code Examples (L13: Feature Selection) (27:38)
13.4.2 Feature Permutation Importance (L13: Feature Selection) (16:56)
13.4.1 Recursive Feature Elimination (L13: Feature Selection) (28:52)
13.3.2 Decision Trees & Random Forest Feature Importance (L13: Feature Selection) (39:43)
13.3.1 L1-regularized Logistic Regression as Embedded Feature Selection (L13: Feature Selection) (23:33)
13.2 Filter Methods for Feature Selection -- Variance Threshold (L13: Feature Selection) (19:53)
13.1 The Different Categories of Feature Selection (L13: Feature Selection) (11:39)
13.0 Introduction to Feature Selection (L13: Feature Selection) (16:10)
Introduction to Generative Adversarial Networks (Tutorial Recording at ISSDL 2021) (1:28:26)
Designing Generative Adversarial Networks for Privacy-enhanced Face Recognition (Conference rec.) (34:39)
L19.5.2.2 GPT-v1: Generative Pre-Trained Transformer (9:54)
L19.5.2.4 GPT-v2: Language Models are Unsupervised Multitask Learners (9:03)
L19.5.2.7: Closing Words -- The Recent Growth of Language Transformers (6:10)
L19.5.2.6 BART: Combining Bidirectional and Auto-Regressive Transformers (10:15)
L19.5.2.5 GPT-v3: Language Models are Few-Shot Learners (6:41)
L19.6 DistilBert Movie Review Classifier in PyTorch -- Code Example (17:58)
L19.5.2.3 BERT: Bidirectional Encoder Representations from Transformers (18:31)
L19.5.2.1 Some Popular Transformer Models: BERT, GPT, and BART -- Overview (8:41)
L19.5.1 The Transformer Architecture (22:36)
L19.4.3 Multi-Head Attention (7:37)
L19.4.2 Self-Attention and Scaled Dot-Product Attention (16:09)
L19.4.1 Using Attention Without the RNN -- A Basic Form of Self-Attention (16:11)
L19.3 RNNs with an Attention Mechanism (22:19)
L19.2.1 Implementing a Character RNN in PyTorch (Concepts) (9:20)
L19.2.2 Implementing a Character RNN in PyTorch -- Code Example (25:57)
L19.1 Sequence Generation with Word and Character RNNs (17:44)
L19.0 RNNs & Transformers for Sequence-to-Sequence Modeling -- Lecture Overview (3:05)
L18.6: A DCGAN for Generating Face Images in PyTorch -- Code Example (12:43)
L18.5: Tips and Tricks to Make GANs Work (17:14)
L18.4: A GAN for Generating Handwritten Digits in PyTorch -- Code Example (22:46)
L18.3: Modifying the GAN Loss Function for Practical Use (18:50)
L18.2: The GAN Objective (26:26)
L18.1: The Main Idea Behind GANs (10:43)
L18.0: Introduction to Generative Adversarial Networks -- Lecture Overview (5:15)
L17.7 VAE Latent Space Arithmetic in PyTorch -- Making People Smile (Code Example) (11:54)
L17.6 A Variational Autoencoder for Face Images in PyTorch -- Code Example (10:06)
L17.5 A Variational Autoencoder for Handwritten Digits in PyTorch -- Code Example (23:13)
L17.4 Variational Autoencoder Loss Function (12:16)
L17.3 The Log-Var Trick (7:35)
L17.2 Sampling from a Variational Autoencoder (9:27)
L17.1 Variational Autoencoder Overview (5:24)
L17.0 Intro to Variational Autoencoders -- Lecture Overview (3:16)
L16.5 Other Types of Autoencoders (5:34)
L16.4 A Convolutional Autoencoder in PyTorch -- Code Example (15:21)
L16.3 Convolutional Autoencoders & Transposed Convolutions (16:08)
L16.2 A Fully-Connected Autoencoder (16:35)
L16.1 Dimensionality Reduction (9:40)
L16.0 Introduction to Autoencoders -- Lecture Overview (4:45)
L15.7 An RNN Sentiment Classifier in PyTorch (40:00)
L15.6 RNNs for Classification: A Many-to-One Word RNN (29:07)
L15.5 Long Short-Term Memory (16:58)
L15.4 Backpropagation Through Time Overview (9:34)
L15.3 Different Types of Sequence Modeling Tasks (4:32)
L15.2 Sequence Modeling with RNNs (13:40)
L15.1: Different Methods for Working With Text Data (15:58)
L15.0: Introduction to Recurrent Neural Networks -- Lecture Overview (3:59)
L14.6.2 Transfer Learning in PyTorch -- Code Example (11:36)
L14.6.1 Transfer Learning (7:39)
L14.5 Convolutional Instead of Fully Connected Layers (14:33)
L14.4.2 All-Convolutional Network in PyTorch -- Code Example (8:17)
L14.4.1 Replacing Max-Pooling with Convolutional Layers (8:19)
Deep Learning News #10, Apr 3 2021 (20:55)
L14.3.2.2 ResNet-34 in PyTorch -- Code Example (18:48)
L14.3.2.1 ResNet Overview (14:42)
L14.3.1.2 VGG16 in PyTorch -- Code Example (15:52)
L14.3.1.1 VGG16 Overview (6:06)
L14.3: Architecture Overview (3:24)
L14.2: Spatial Dropout and BatchNorm (6:46)
L14.1: Convolutions and Padding (11:14)
L14.0: Convolutional Neural Networks Architectures -- Lecture Overview (6:18)
Deep Learning News #9, Mar 27 2021 (28:10)
L13.9.3 AlexNet in PyTorch (15:16)
L13.9.2 Saving and Loading Models in PyTorch (5:45)
L13.9.1 LeNet-5 in PyTorch (13:12)
L13.8 What a CNN Can See (13:43)
L13.7 CNN Architectures & AlexNet (20:17)
L13.6 CNNs & Backpropagation (5:54)
Deep Learning News #8, Mar 20 2021 (18:03)