Prompting Techniques (Zero-shot, Few-shot, Chain-of-Thought)
Prompting techniques are structured methods for guiding a language model's output toward the desired level of accuracy.

Zero-shot prompting: instructing the model with only a task description, providing no examples.

Few-shot prompting: including a few examples (input-output pairs) in the prompt; these serve as a template whose format and style the model adapts to.

Chain-of-Thought (CoT): prompting the model to lay out its reasoning step by step (e.g. "Let's think step by step"), which substantially improves accuracy on logical and mathematical tasks.

These techniques maximize model performance without modifying the model's weights, as fine-tuning would.
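The three techniques differ only in how the prompt string is assembled before it is sent to a model. A minimal sketch of the three styles, where the sentiment task, the example pairs, and all wording are illustrative assumptions rather than anything prescribed by a particular model or API:

```python
def zero_shot(text: str) -> str:
    # Task description only, no examples.
    return (
        "Classify the sentiment as positive or negative.\n"
        f"Text: {text}\nSentiment:"
    )

def few_shot(text: str) -> str:
    # A few input-output pairs act as a template for format and style.
    examples = [
        ("I loved this film.", "positive"),
        ("The service was terrible.", "negative"),
    ]
    demos = "\n".join(f"Text: {t}\nSentiment: {s}" for t, s in examples)
    return f"{demos}\nText: {text}\nSentiment:"

def chain_of_thought(question: str) -> str:
    # The trigger phrase invites step-by-step reasoning before the answer.
    return f"Q: {question}\nA: Let's think step by step."

print(zero_shot("A wonderful surprise."))
print(few_shot("A wonderful surprise."))
print(chain_of_thought("If I have 3 apples and eat one, how many remain?"))
```

Each function returns a plain string; in practice that string would be passed as the prompt to whichever model client is in use.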