LLM Basic Notes

Completed

To Do

  • Token / Tokenization
  • Embedding
  • Self-Attention
  • Multi-Head Attention
  • Positional Encoding
  • FFN / Residual / LayerNorm
  • Pretraining / SFT / RLHF / DPO
  • KV Cache / Quantization / RAG / Agent
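As a starting point for the Self-Attention item above, here is a minimal sketch of scaled dot-product attention in pure Python. It is a toy single-head version with no learned projections (Q = K = V = X), intended only to show the score / softmax / weighted-sum structure; real implementations use learned projection matrices and batched tensor ops.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def self_attention(X):
    # Toy scaled dot-product self-attention with Q = K = V = X.
    d = len(X[0])
    out = []
    for q in X:
        # Attention scores: dot(q, k) / sqrt(d) against every key.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in X]
        w = softmax(scores)
        # Output is the attention-weighted sum of the value vectors.
        out.append([sum(wi * v[j] for wi, v in zip(w, X)) for j in range(d)])
    return out

# Three 2-d token vectors as a tiny example sequence.
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
Y = self_attention(X)
```

Each row of `Y` is a convex combination of the input rows (the softmax weights sum to 1), which is the core mixing mechanism that multi-head attention repeats across several projected subspaces.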