[Prompt Engineering - Part 1] Maximizing the Use of LLM with Prompt Design

2024/07/30 | Written By: Suwan Kim, Sungmin Park

(This content is based on part of the second session of Upstage's "ChatGPT UP for Everyone!" course, titled "Crafting Expertise with Your Own AI Technology: ChatGPT Deep Prompt Engineering & LangChain.")


As we delve into the era of Large Language Models (LLMs), a novel profession has emerged: that of the Prompt Engineer. Prompt Engineers design the precise instructions or queries to ensure that AI generates the desired output. Recently, the demand for this expertise has been on a marked rise.

To fully leverage the power of LLMs, crafting effective prompts is imperative. In this blog, we will explore the concept of Prompt Engineering using Solar LLM and examine methodologies to elicit optimal results from generative AI.

Upstage's Solar LLM automatically generates natural and engaging text, ideal for chatbots, content creation, and any application requiring high-quality written content. Join our console and explore the world of prompt design with Solar LLM, all for free. Let’s get started!
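As a taste of what is ahead, here is a minimal sketch of how a prompt might be packaged as an OpenAI-style chat request before being sent to Solar LLM. The model name "solar-pro" is an illustrative assumption, not a confirmed value; check the console for the current model identifiers.

```python
# Sketch: packaging a prompt as an OpenAI-style chat-completion payload.
# The model name "solar-pro" is assumed for illustration only.

def build_chat_request(prompt: str, model: str = "solar-pro") -> dict:
    """Assemble a chat-completion payload from a user prompt."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

request = build_chat_request("Recommend three destinations for an August trip.")
print(request["messages"][0]["content"])
```

A real call would post this payload to the API endpoint shown in your console; the sketch stops at constructing it.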


What is Prompt Engineering?

A Prompt is the input that helps an LLM produce the desired output. In simple terms, it is the instruction or conversation starter given to the AI model.

Prompt Engineering involves finding the optimal combination of prompt inputs to enhance the quality of the generated results. Essentially, it is the field that researches how to develop and optimize prompts to use large language models more effectively. Prompt Engineering helps the LLM better understand queries and provide more accurate responses.


Components of Prompt Engineering

A prompt is typically composed of 4 main components:

  1. Instruction

    • A description of the specific task the LLM is expected to perform.

  2. Context

    • Supplementary information that helps the model provide better answers.

  3. Input Data

    • The question or input related to what you are seeking answers for.

  4. Output Indicator

    • Elements that indicate the type or format of the desired output.

In the above example, the user assigns the LLM a persona (a fictional role or specific character trait to be reflected in the generated output), specifically a "Job Application Guidance Bot." This persona belongs to the crucial Instruction component.

Moreover, to secure an effective response, it is important to include Context as reference data and specific Output Indicators within the prompt. This approach ensures that when a user supplies Input Data, the LLM generates the appropriate result.
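The four components above can be sketched as a simple assembly function. The labels and the job-application wording below are illustrative, not a fixed API:

```python
def build_prompt(instruction: str, context: str,
                 input_data: str, output_indicator: str) -> str:
    """Join the four prompt components into a single prompt string."""
    return "\n\n".join([
        f"Instruction: {instruction}",
        f"Context: {context}",
        f"Input Data: {input_data}",
        f"Output Indicator: {output_indicator}",
    ])

prompt = build_prompt(
    instruction="You are a Job Application Guidance Bot.",
    context="The applicant is a new graduate applying for an AI engineer role.",
    input_data="Review the cover letter draft provided by the user.",
    output_indicator="Give three bullet-point suggestions.",
)
print(prompt)
```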




Prompt Design Guide

Before embarking on the detailed process of composing a Prompt, let's first identify the key considerations for crafting an efficient and effective Prompt.


1. Keep it Concise and Clear

To leverage the capabilities of an LLM effectively, it is crucial to provide instructions that are as concise and clear as possible. Instead of an overly verbose explanation, it is more beneficial to capture the essence and include specific output indicators for clearer and more precise answers, as seen in the examples below.

👎 Where should I go for my summer vacation? I’m torn between the beach and the mountains. Although I’ve received recommendations for domestic destinations, I’m also drawn to international travel these days. Where should I go?

✅ Recommend three beautiful natural scenery destinations abroad that are ideal for traveling in August.


Similarly, adding explicit conditions such as "lower body muscle strengthening" and clear output indicators like "weekly workout routine" can significantly enhance the depth and quality of responses generated by LLM compared to vague prompts.

👎 I’m considering working out every day. What should I do?

✅ Create a weekly workout routine to strengthen lower body muscles.


2. Clearly Specify the Output Format

The second criterion for a well-crafted prompt involves the Output Indicators component that we reviewed earlier. This involves explicitly requesting the format in which the user's instructions should be delivered. You can specify anything from common formats such as quantity, character count, and line-by-line arrangement to more technical formats like JSON, tables, or HTML according to the user's requirements.

Example of output in table format:

Additionally, when utilizing LLMs for writing assistance or marketing, it is crucial to provide clear and specific output instructions to achieve high-quality results. If concise writing is needed, specifying the character count and tone (e.g., aphoristic, social media promotion, friendly, humorous, etc.) will help generate the desired outcomes efficiently. Ensure your prompt includes both concise and clear input instructions as well as specific output indicators to achieve effective results.

An example of a prompt written with specific output guidelines such as "within 50 characters," "one line," or "promo tagline":
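One especially useful output indicator is a machine-readable format such as JSON, because the result can then be validated programmatically. The sketch below pins down the format in the prompt and checks a hypothetical model response; the response string is made up for illustration.

```python
import json

prompt = (
    "Write a promo tagline for a summer sale.\n"
    "Respond ONLY with JSON in exactly this format:\n"
    '{"tagline": "<one line, within 50 characters>"}'
)

# A hypothetical model response in the requested format:
response_text = '{"tagline": "Summer deals too hot to miss!"}'

result = json.loads(response_text)       # fails fast if the format drifts
assert len(result["tagline"]) <= 50      # enforce the stated character limit
print(result["tagline"])
```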



3. Provide Examples

If you have a clear idea of the desired output, providing examples can be very effective. By asking the LLM to produce a result that follows a sample output, the results will closely match your expectations.

👎 Create a fictional company. It would be great if you could also add a brief description of what the company does.

✅ Create a fictional fashion company called "AAA" using the format below.
- Company Name: Upstage
- Industry: System Software Development and Supply
- Products/Services: Development and supply of artificial intelligence (AI) systems, software development and production, and other system operation-related services.

Providing examples can also be useful for marketers exploring ideas or writing promotional content. You can include a sample sentence in the desired style and ask for promotional taglines, email drafts, outline creation, ideation, and various other tasks based on that example.
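This example-driven pattern is often called few-shot prompting. A minimal sketch of assembling such a prompt from worked input/output pairs follows; the helper name and layout are illustrative assumptions.

```python
def few_shot_prompt(task, examples, query):
    """Prepend worked input/output pairs so the model mimics their format."""
    lines = [task, ""]
    for sample_input, sample_output in examples:
        lines += [f"Input: {sample_input}", f"Output: {sample_output}", ""]
    lines += [f"Input: {query}", "Output:"]
    return "\n".join(lines)

prompt = few_shot_prompt(
    task="Describe a company in the format shown.",
    examples=[("Upstage",
               "Industry: System Software Development and Supply")],
    query="AAA",
)
print(prompt)
```

Ending the prompt with a bare "Output:" nudges the model to complete the final pair in the same style as the examples.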



4. Specify Sections When Lengthy or Complex

When a prompt becomes extensive or complex, designating sections can help produce better results. In the example below, the context and output indicators are separated by brackets. Just as separating key information into sections makes it easier for us to understand at a glance, it also helps AI models process the information more efficiently.

Example of organizing long content into clearly designated sections:

👎 Example of a bad prompt design

✅ Example of good prompt design
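Sectioning can be sketched as a small helper that wraps each part of the prompt in a bracketed label, mirroring the bracket-separated example above. The section names here are illustrative.

```python
def sectioned_prompt(sections: dict) -> str:
    """Wrap each part of the prompt in a bracketed, labeled section."""
    return "\n\n".join(f"[{name}]\n{body}" for name, body in sections.items())

prompt = sectioned_prompt({
    "Instruction": "Summarize the review below in one sentence.",
    "Context": "The review is for a wireless keyboard.",
    "Output Indicator": "One sentence, neutral tone.",
})
print(prompt)
```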



5. Organize Tasks Sequentially

When leveraging an LLM for diverse assignments, it is advantageous to delineate tasks sequentially and specify the order of outputs. Rather than providing a vague instruction such as "Write a report based on the given information," it is more effective to decompose the work into distinct steps. For instance, you can instruct the LLM to filter data that meets specific criteria, summarize the relevant content, and then compose a concise report. This structured approach ensures that both the user's requirements and the quality of the output are satisfactorily met.

A crucial point is that each task assigned to the LLM should be clearly defined and easy for the model to interpret. Particularly for tasks involving qualitative assessments, providing explicit criteria or benchmarks that the model can apply is essential. Without such guidance, the precision and completeness of the generated output may be compromised.

Example of a prompt that organizes tasks in sequence:
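The step-by-step decomposition described above can be sketched as a numbered-task prompt builder; the goal and step wording are illustrative.

```python
def sequential_prompt(goal: str, steps: list) -> str:
    """Number the sub-tasks so the model performs them in order."""
    numbered = "\n".join(f"{i}. {step}" for i, step in enumerate(steps, 1))
    return f"{goal}\n\nComplete the following steps in order:\n{numbered}"

prompt = sequential_prompt(
    goal="Write a short report from the attached sales data.",
    steps=[
        "Filter rows where monthly revenue exceeds $10,000.",
        "Summarize the filtered rows in three sentences.",
        "Compose a one-paragraph report from the summary.",
    ],
)
print(prompt)
```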

6. Iteration and Continuous Testing

The final aspect of Prompt Engineering we address is the importance of iteration and continuous testing. Since Prompt Engineering involves finding the optimal combination of input data, achieving a perfect prompt on the first try is unlikely for anything beyond simple tasks. Therefore, to create a high-quality prompt, it is essential to use the LLM regularly, identify any deficiencies, make the necessary modifications, and test persistently.

If you are unsatisfied with the results generated by your prompt, consider revising it based on the following suggestions. Understanding how to construct a prompt and experimenting with different approaches will help you cultivate your expertise.

  • Change the style

  • Set the length

  • Emphasize different aspects (e.g., highlight the price more in a promotional tagline)

  • Rewrite based on several examples

  • Start fresh if the conversation becomes lengthy (e.g., when you don't want to consider previous chat content)

  • Place or repeat important information at the end

  • Write in English if necessary

  • Use Markdown for formatting

  • Remember that the output may vary depending on the version of GPT

  • Use code if you want consistent results for the same input
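This iterative testing can itself be lightly automated: run several prompt variants, check each output against a simple criterion, and keep what passes. The sketch below uses a stand-in model function purely for illustration; a real run would call the LLM API instead.

```python
def evaluate_variants(variants, model_fn, passes):
    """Return the prompt variants whose model output satisfies the check."""
    return [prompt for prompt in variants if passes(model_fn(prompt))]

# Stand-in for a real model call, used only for illustration:
def stub_model(prompt):
    return prompt.upper()

kept = evaluate_variants(
    ["Write a one-line tagline.", "tagline"],
    stub_model,
    passes=lambda output: len(output) > 10,  # e.g., reject too-short outputs
)
print(kept)
```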

We have now explored what Prompt Engineering entails and how to design effective prompts to fully harness the capabilities of LLMs. Use the various Prompt Design guidelines we've discussed to experiment directly with Solar!