Welcome to the Introduction to Generative AI Workshop

Session 01
  Slides on Attention & Transformers
  Slides on LLMs
  Building nanoGPT
    Slides on nanoGPT
    Bigram Model
    Bigram Model with Linear layer & Token + Positional embeddings
    Self Attention
    Multiple Heads, Feedforward Layer
    Residual Connections, Transformer Block
    Projection, Layernorm, Dropout

Session 02
  Slides on image generation
  Slides on finetuning
  Finetuning SDXL DreamBooth using AutoTrain