1. Scaling Laws for Neural Language Models
Authors: Jared Kaplan et al.
2. Attention Is All You Need
Authors: Ashish Vaswani et al.
3. Language Models are Few-Shot Learners
Authors: Tom Brown et al.
4. Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks
Authors: Patrick Lewis et al.
5. Constitutional AI: Harmlessness from AI Feedback
Authors: Yuntao Bai et al. (Anthropic)
6. LoRA: Low-Rank Adaptation of Large Language Models
Authors: Edward Hu et al.
7. Chain-of-Thought Prompting Elicits Reasoning in Large Language Models
Authors: Jason Wei et al.
8. Training Compute-Optimal Large Language Models
Authors: Jordan Hoffmann et al.
9. GPT-4 Technical Report
Authors: OpenAI