Two-week intensive curriculum

LLM and AI Study Guide

This guide is designed for 12 to 14 hours of study per day over two weeks. Each subject page is intentionally dense: it gives you a beginner track, an advanced engineering track, practical examples, Python code, diagrams, and a concrete to-do list covering what to learn, practice, and build before moving on.

06. Vector databases

Approximate nearest neighbor search, indexing, filtering, and storage design.

07. Embeddings

Dense representations, similarity, domain adaptation, and practical evaluation.

11. Multilingual NLP

Cross-lingual transfer, multilingual embeddings, and language-specific pitfalls.

18. Word embeddings

Word2Vec, GloVe, fastText, analogies, and limitations of static embeddings.

21. Transformers

The modern language-modeling architecture: attention, scaling, and training mechanics.

31. My Notes

Personal study notes: LLM fine-tuning, CNNs, RNNs, Transformers, attention, hyperparameters, KV cache, and more.

31b. Notes Exam

Fifty multiple-choice questions covering neural networks, attention, embeddings, training, and interview-focused topics.

32. Loss Functions

MSE, MAE, Huber, binary cross-entropy, categorical cross-entropy, hinge, KL divergence, contrastive, and triplet loss.

33. Deep Learning Fundamentals

Neurons, activation functions, backpropagation, CNNs, ResNet, transfer learning, Transformers, BERT, GPT, and generative models.