
Optimize an LLM Prompt Template

Optimize an LLM prompt the right way using our template

About the optimize an LLM prompt template

Large language models, or LLMs, have reached mainstream popularity, allowing people to accomplish their tasks with greater clarity and precision. To make the most of LLMs, it is important to use good prompts that work well for your use case. This template will guide you through the process of crafting and using LLM prompts the right way.

Optimize an LLM prompt template content

This template lays out a step-by-step process for optimizing your LLM prompt. Start by clearly defining the problem you want the LLM to solve. Write down any specific inputs and outputs, or, for open-ended prompts, think about what good responses should look like. Then write clear instructions with examples; the more detail and specificity, the better the results. Next, test the prompt in a playground to see how it works and make any necessary adjustments. If the model is hallucinating, tell it to answer from a reference text or to cite its sources. Split complex tasks into simple steps and make multiple queries, and if the content gets very long, summarize it before giving it to the model. Then tell the model to take time to think or to check its own work, and test the output against your success criteria. Finally, fine-tune the model if desired.
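The steps above can be sketched in code as a small prompt builder. This is a minimal illustration, not part of the template itself: the function name, its parameters, and the wording of the assembled prompt are all assumptions chosen for the example.

```python
# Illustrative sketch: assemble a prompt following the template's steps.
# build_prompt and its parameters are hypothetical, not a real library API.

def build_prompt(task, examples=None, reference_text=None, cite_sources=False):
    """Build an LLM prompt with clear instructions, concrete examples,
    optional reference text, and an instruction to cite sources
    (which helps reduce hallucination)."""
    parts = [f"Task: {task}"]
    if examples:
        parts.append("Examples of good responses:")
        parts.extend(f"- {ex}" for ex in examples)
    if reference_text:
        parts.append("Answer using only this reference text:")
        parts.append(reference_text)
    if cite_sources:
        parts.append("Cite the part of the reference text that supports each claim.")
    # Ask the model to take time to think and to check its own work.
    parts.append("Think step by step, then check your answer against the task.")
    return "\n".join(parts)

prompt = build_prompt(
    task="Summarize the quarterly report in three bullet points.",
    examples=["Revenue grew 12% quarter over quarter."],
    reference_text="Q3 revenue was $4.2M, up from $3.75M in Q2.",
    cite_sources=True,
)
print(prompt)
```

For a complex task, you would call a builder like this once per simple sub-step and make multiple queries, then test each response against your success criteria in a playground before settling on the final wording.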