🟢 Introduction to Few-Shot Prompting Techniques
Last updated on September 27, 2024 by Valeriia Kuka
Welcome to the few-shot section of the advanced Prompt Engineering Guide.
While zero-shot prompting is the most basic form of interaction, where the large language model (LLM) receives only an instruction with no examples, few-shot prompting takes it a step further: it provides the model with example pairs of problems and their correct solutions. These examples help the model better understand the task and improve its responses.
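To make this concrete, here is a minimal sketch of how a few-shot prompt can be assembled. The sentiment-analysis examples and the prompt format are illustrative assumptions, not from this guide:

```python
# Hypothetical example pairs: each is a (problem, solution) tuple.
examples = [
    ("The movie was fantastic!", "positive"),
    ("I regret buying this product.", "negative"),
]
query = "The service was surprisingly good."

# Build the prompt: solved examples first, then the unsolved query.
prompt_lines = []
for text, label in examples:
    prompt_lines.append(f"Review: {text}\nSentiment: {label}")
prompt_lines.append(f"Review: {query}\nSentiment:")
prompt = "\n\n".join(prompt_lines)
print(prompt)
```

The resulting string is what you would send to an LLM: the example pairs establish the pattern, and the model is expected to complete the final `Sentiment:` line.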
Here are a few techniques we've already explored, with more on the way!
- K-Nearest Neighbor (KNN) Prompting selects relevant examples by finding the most similar cases to the input query, improving the accuracy of few-shot prompts.
- Self-Ask Prompting breaks complex questions down into sub-questions, helping LLMs reason more effectively and provide better answers.
- Prompt Mining searches a corpus of text for the prompt template that appears most frequently and uses it as the template for a given task.
- Vote-K Prompting selects diverse and representative exemplars from unlabeled datasets for few-shot prompts.
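As a taste of how example selection works, here is a hedged sketch of KNN-style exemplar selection. It uses a simple bag-of-words cosine similarity for self-containment; real implementations typically use learned embeddings, and the question pool below is invented for illustration:

```python
from collections import Counter
import math

def cosine(a: str, b: str) -> float:
    # Bag-of-words cosine similarity between two strings.
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_select(query: str, pool: list[tuple[str, str]], k: int = 2):
    # Pick the k (problem, solution) pairs most similar to the query.
    return sorted(pool, key=lambda ex: cosine(query, ex[0]), reverse=True)[:k]

# Hypothetical pool of labeled examples.
pool = [
    ("What is the capital of France?", "Paris"),
    ("Translate 'hello' to Spanish.", "hola"),
    ("What is the capital of Japan?", "Tokyo"),
]
selected = knn_select("What is the capital of Italy?", pool)
print(selected)  # the two capital-city questions, not the translation one
```

The selected pairs would then be formatted into a few-shot prompt, so the examples the model sees are the ones most relevant to the query at hand.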
Stay tuned for more advanced techniques coming soon!