Zero-shot, few-shot, fine-tuning: accuracy and cost rise as you move from the first to the last. So does setup time.
These are three points on the spectrum of how much task-specific information you give the model (Brown et al., "Language Models are Few-Shot Learners," OpenAI, 2020).
Zero-shot relies entirely on pre-training. Fine-tuning permanently adapts weights. Few-shot is the middle ground — shown examples shape behavior for that one request.
```
Classify this review as positive or negative: "Loved it!"
```
The model pattern-matches from pre-training.
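That zero-shot prompt can be assembled programmatically before being sent to any chat-completion API. A minimal sketch (the `zero_shot_prompt` function is an illustrative name, not part of any SDK):

```python
def zero_shot_prompt(review: str) -> str:
    """Build a zero-shot prompt: instruction only, no examples.

    The model must rely entirely on what it learned in pre-training.
    """
    return f'Classify this review as positive or negative: "{review}"'

print(zero_shot_prompt("Loved it!"))
# Classify this review as positive or negative: "Loved it!"
```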
```
Review: "Amazing product" -> positive
Review: "Waste of money" -> negative
Review: "Loved it!" -> ?
```
Examples anchor the format and edge cases.
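The example-prepending step can be sketched the same way (`few_shot_prompt` and the `EXAMPLES` pairs are illustrative, not from any library):

```python
# Labeled examples that will be prepended to every request.
EXAMPLES = [
    ("Amazing product", "positive"),
    ("Waste of money", "negative"),
]

def few_shot_prompt(review: str, examples=EXAMPLES) -> str:
    """Prepend labeled examples so the model mirrors their format."""
    lines = [f'Review: "{text}" -> {label}' for text, label in examples]
    lines.append(f'Review: "{review}" -> ')
    return "\n".join(lines)

print(few_shot_prompt("Loved it!"))
```

Note that the examples ride along on every single request, which is exactly why few-shot costs more per call than zero-shot.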
Upload 1,000+ labeled review pairs through a provider's fine-tuning API (OpenAI, Anthropic) or an open-source training script. The model's weights update. You then query without any examples and get the fine-tuned behavior.
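The upload step expects training data in a specific file format. OpenAI's chat fine-tuning, for instance, takes a JSONL file where each line holds a `messages` array; a sketch of preparing such a file (the two pairs are stand-ins for a real labeled dataset):

```python
import json

# Stand-in data; a real fine-tuning run wants 1,000+ labeled pairs.
pairs = [
    ("Amazing product", "positive"),
    ("Waste of money", "negative"),
]

with open("reviews.jsonl", "w") as f:
    for review, label in pairs:
        record = {
            "messages": [
                {"role": "user",
                 "content": f"Classify this review as positive or negative: {review}"},
                {"role": "assistant", "content": label},
            ]
        }
        f.write(json.dumps(record) + "\n")
```

Once trained, the deployed model answers the bare question correctly with no examples in the prompt.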
| Need | Approach |
|---|---|
| Prototype quickly | Zero-shot |
| Consistent format / edge cases | Few-shot |
| High volume, latency-sensitive, specific style | Fine-tuning |
| Fresh data changes often | Zero-shot + RAG |
| Tiny output space (classify into 10 categories) | Fine-tuning |
How many examples count as few-shot? Typically 1-10. Beyond that, diminishing returns — fine-tuning becomes viable.
Does few-shot cost more per request? Yes — examples eat tokens. At scale, fine-tuning often wins on cost.
Is fine-tuning worth it? Only if zero-shot + few-shot cannot hit accuracy, OR you have >100K requests/month where per-request savings matter.
Can I combine approaches? Yes — fine-tune for style, then RAG for facts, then few-shot for format.
What is instruction tuning? A specific kind of fine-tuning that teaches models to follow natural-language instructions. All modern chatbots are instruction-tuned.
Can open-source models be fine-tuned cheaply? Yes — LoRA / QLoRA fine-tunes 7B models on a single GPU for ~$5-50.
Does fine-tuning cause forgetting? Yes — models can lose general capability. Monitor regressions.
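The cost questions above reduce to break-even arithmetic. A sketch with purely illustrative numbers (the token counts, per-token price, and training cost are assumptions, not any provider's actual rates, and the model ignores any per-token premium for serving a fine-tuned model):

```python
# Illustrative, assumed numbers:
price_per_1k_tokens = 0.002       # $ per 1K prompt tokens (hypothetical)
few_shot_overhead_tokens = 500    # extra tokens the examples add per request
fine_tune_fixed_cost = 100.0      # $ one-off training cost (hypothetical)

# Extra spend per request caused by shipping the examples every time.
extra_cost_per_request = few_shot_overhead_tokens / 1000 * price_per_1k_tokens

# Requests after which the one-off training cost pays for itself.
break_even_requests = fine_tune_fixed_cost / extra_cost_per_request
print(f"{break_even_requests:,.0f} requests to recoup fine-tuning")
# → 100,000 requests to recoup fine-tuning
```

Under these assumptions the training cost is recovered at 100K requests, which is why high-volume workloads are where fine-tuning starts to win on price.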
Start zero-shot. Add few-shot when format slips. Fine-tune only when zero-shot + few-shot + RAG hit a wall. Read more patterns on Misar Blog.