📚 Paper Digest

April 18, 2026 · 10 featured papers

1. Scaling Laws for Neural Language Models

Authors: Jared Kaplan et al.

arXiv: 2001.08361 · LLM

2. Attention Is All You Need

Authors: Ashish Vaswani et al.

arXiv: 1706.03762 · Architecture

3. Language Models are Few-Shot Learners

Authors: Tom Brown et al.

arXiv: 2005.14165 · LLM

4. Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks

Authors: Patrick Lewis et al.

arXiv: 2005.11401 · RAG

5. Constitutional AI: Harmlessness from AI Feedback

Authors: Anthropic

arXiv: 2212.08073 · Alignment

6. LoRA: Low-Rank Adaptation of Large Language Models

Authors: Edward Hu et al.

arXiv: 2106.09685 · Fine-tuning

7. Chain-of-Thought Prompting Elicits Reasoning in Large Language Models

Authors: Jason Wei et al.

arXiv: 2201.11903 · Prompting

8. Training Compute-Optimal Large Language Models

Authors: Jordan Hoffmann et al.

arXiv: 2203.15556 · Training

9. GPT-4 Technical Report

Authors: OpenAI

arXiv: 2303.08774 · LLM

10. Llama 2: Open Foundation and Fine-Tuned Chat Models

Authors: Meta AI

arXiv: 2307.09288 · Open Source