Chain-of-Thought Prompting is a Few-Shot prompting technique in which the prompt instructs the LLM to work through intermediate reasoning steps before giving the final answer. Usually, one or more worked examples of this reasoning are included before the new question (hence the Few-Shot nature of the prompt). A common example is:
Q: Jack has two baskets, each containing three balls. How many balls does Jack have in total?
A: One basket contains 3 balls, so two baskets contain 3 * 2 = 6 balls.
Q: [the question for inference]
A:
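The sketch below shows one way to assemble such a prompt programmatically in Python. It is a minimal illustration, not a fixed recipe: the exemplar mirrors the basket example above, and `ask_llm`-style delivery is left out, since the resulting string can be sent to whichever LLM client you use.

```python
# Minimal sketch: building a Few-Shot Chain-of-Thought prompt.
# The exemplar shows the reasoning style we want the model to imitate.

COT_EXEMPLAR = (
    "Q: Jack has two baskets, each containing three balls. "
    "How many balls does Jack have in total?\n"
    "A: One basket contains 3 balls, so two baskets contain 3 * 2 = 6 balls.\n"
)


def build_cot_prompt(question: str) -> str:
    """Prepend the worked exemplar, then leave 'A:' open for the model to complete."""
    return f"{COT_EXEMPLAR}Q: {question}\nA:"


if __name__ == "__main__":
    prompt = build_cot_prompt(
        "Sara has five boxes, each containing four pencils. "
        "How many pencils does Sara have in total?"
    )
    print(prompt)  # pass this string to your LLM of choice
```

Because the exemplar answer spells out the intermediate arithmetic, the model is nudged to produce a similar step-by-step derivation for the new question rather than jumping straight to a number.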
Wei, J., Wang, X., Schuurmans, D., Bosma, M., Ichter, B., Xia, F., Chi, E., Le, Q., & Zhou, D. (2022). Chain of Thought Prompting Elicits Reasoning in Large Language Models.